
Nazim Lab Session 1-15


Lab Layout

Lab Session 01

Table 1
1. Mechanical Compactor
2. Steel Rule
3. Ring Gauge
4. Plug Gauge
5. Thread Gauge
Table 2
6. Angle Gauge
7. Feeler Gauge
8. Wire Gauge
9. Outside Caliper
10. Inside Caliper
Table 3
11. Digital Vernier Caliper
12. Dial Indicator Stand
13. Dial Indicator
Table 4
14. Surface Plate
15. Digital Micrometer
16. Slip Gauge Set
Table 5
17. Sine Bar
18. Depth Micrometer
19. Universal Bevel Protractor
Table 6
20. Dial Bore Gauge
21. Vernier Depth Gauge
22. Vernier Height Gauge

Apparatus Overview
Angle Gauge:
An angle gauge is a measuring tool used to determine the angle of a surface
or object relative to a reference plane. It typically consists of a flat base
with a protractor or circular scale attached, along with an adjustable arm or
blade that can be set to a specific angle. Angle gauges may be analog or
digital and can be used in a variety of applications, such as woodworking,
metalworking, engineering, and construction. They are commonly used to
ensure precision and accuracy in the fabrication and assembly of objects and
structures where specific angles are critical to their function.

Angle Gauges
Wire Gauge:
Wire gauge refers to the measurement system used to determine the size or
diameter of a wire. It is typically represented by a numerical value, with
smaller numbers indicating a larger diameter wire. The wire gauge system is
widely used in electrical and electronic industries to determine the appropriate
wire size for a specific application, based on factors such as the amount of
current flowing through the wire, the length of the wire, and the voltage drop
that can be tolerated. Wire gauge can also refer to a tool used to measure the
diameter of a wire, which may be in the form of a circular or flat blade with
notches or markings that correspond to the various gauge sizes.

Inside Caliper:
An inside caliper is a measuring tool used to determine the internal
dimensions of an object or space, such as the diameter of a hole or the
distance between two opposing surfaces. It typically consists of two
legs, one fixed and one adjustable, that are joined at a hinge and can be
opened and closed by turning a screw or adjusting a nut. The legs are
designed to be inserted into a hole or opening, and the distance between
them can be measured using a ruler or other measuring device. Inside
calipers come in a variety of sizes and shapes, with different leg lengths
and tip configurations to suit different applications. They are commonly
used in machining, woodworking, metalworking, and other precision
manufacturing processes to ensure accurate measurements and
dimensions.

Outside Caliper:
An outside caliper is a measuring tool used to determine the external
dimensions of an object or space, such as the diameter of a rod or the width
of a groove. It typically consists of two legs, one fixed and one adjustable,
that are joined at a hinge and can be opened and closed by turning a screw
or adjusting a nut. The legs are designed to be placed around the object
being measured, and the distance between them can be measured using a
ruler or other measuring device. Outside calipers come in a variety of sizes
and shapes, with different leg lengths and tip configurations to suit
different applications. They are commonly used in machining, metalworking, woodworking, and other
precision manufacturing processes to ensure accurate measurements and dimensions.

Digital Vernier Caliper:


A digital vernier caliper, also known as a digital
caliper, is a measuring tool used to determine the
precise dimensions of an object. It consists of a
set of jaws that can be opened and closed to grip
an object, along with a digital display that
provides an accurate readout of the
measurement. The jaws are typically designed
with both internal and external measuring
surfaces, allowing them to measure both the
inside and outside dimensions of an object.
Digital vernier calipers are extremely accurate and precise, with the ability to measure to within
thousandths of an inch or hundredths of a millimeter. They are commonly used in a variety of
industries, including manufacturing, machining, engineering, and woodworking, as well as in scientific
and medical applications where precise measurements are critical.

Depth Micrometer:
A depth micrometer is a precision measuring tool
used to measure the depth of holes, slots, and other
recessed features in an object. It typically consists of a
calibrated scale attached to a micrometer head that
can be adjusted up or down by turning a screw. The
micrometer head is attached to a long, thin probe that
can be inserted into the recessed feature being
measured. The probe may be straight or angled,
depending on the application. Depth micrometers
come in a variety of sizes and types, including analog
and digital models, and are commonly used in manufacturing, engineering, and scientific research to
ensure precise depth measurements. They are especially useful in applications where small, precise
measurements are required, such as in the manufacture of miniature parts or electronic components.

Slip Gauge Set:


A slip gauge set, also known as a gauge block set or Jo
block set, is a precision measuring tool consisting of a set
of rectangular metal blocks, usually made of steel or
ceramic, of varying sizes and thicknesses. These blocks
are designed to be used as reference standards to calibrate
and check the accuracy of other measuring instruments,
such as micrometers, height gauges, and dial indicators.
Each block is precisely ground to a specific dimension,
often to within a few millionths of an inch, and is marked with its nominal size or measurement value.
Slip gauge sets may also include accessories such as clamps, handles, and adjusting screws to aid in
their use. They are commonly used in engineering,
manufacturing, and research settings where precise and
accurate measurements are critical.

Digital Micrometer:
A digital micrometer is a precision measuring tool used to
determine the dimensions of an object or space with high accuracy and precision. It typically consists of
a C-shaped frame with a measuring rod or spindle attached to a thimble that can be rotated to adjust the
position of the spindle. A digital readout display is incorporated into the micrometer to display the
measurement value, which is typically given in thousandths of an inch or hundredths of a millimeter.
The spindle is brought into contact with the object being measured, and the measurement is taken by
reading the display on the micrometer. Digital micrometers are commonly used in manufacturing,
engineering, and laboratory settings where precise and accurate measurements are critical. They offer
advantages over traditional analog micrometers, such as increased speed, ease of use, and higher levels
of accuracy and precision.

Universal Bevel Protractor:


A Universal Bevel Protector is a tool used to measure and
transfer angles in woodworking, metalworking, and other
precision manufacturing applications. It typically consists of
two arms that are joined at a pivot point and can be adjusted
to a specific angle using a locking mechanism or thumb
screw. The arms may be graduated with a scale or protractor,
allowing for precise measurement of angles, and they may
also be marked with angle presets for commonly used
angles. The Universal Bevel Protractor is designed to be used
in conjunction with other tools, such as saws or drills, to ensure accurate cutting or drilling angles. It is
commonly used by carpenters, machinists, and metalworkers to create precise angles and shapes in their
work.

Dial Bore Gauge:


A dial bore gauge is a precision measuring instrument used
to determine the internal diameter of a hole or cylindrical
bore. It consists of a long cylindrical body with a
measuring probe at one end and a dial indicator at the
other. The probe is inserted into the bore and the indicator
displays the measurement on a dial or digital readout. The
gauge typically has interchangeable measuring heads of
different sizes to accommodate a wide range of bore
diameters. Dial bore gauges are commonly used in
manufacturing and machining operations, such as engine building, to ensure the accuracy of bores and
the proper fit of pistons or other components. They can also be used in quality control inspections to
verify the dimensional accuracy of manufactured parts.

Lab Session 02
Experiment 01
Objective:
To measure the Length of different specimens using Analogue and Digital Vernier Caliper.

Apparatus:
 Digital Vernier Caliper.
 Analogue Vernier Caliper.
 Cylindrical Specimen
 Magnifying Glass
 Notebook and pen to record measurements

Diagram:

Procedure:
1. Bring the specimen close to the Vernier caliper jaws.
2. Hold it securely between the jaws of the Vernier caliper.
3. Observe the position of the zero of the Vernier scale against the main scale.
4. Note the reading on the main scale, and
5. note which division of the Vernier scale coincides with a main-scale division.
6. Use the following expression to find the length: Length = M.S. + (V.S. × L.C.)
7. Repeat steps 1 to 6 at least four or five times, changing the position of the jaws.
8. Apply the zero error to find the corrected length.
9. The digital Vernier caliper gives the reading directly on its screen.
Precautions:
1. Wipe off any corrosion, dirt, or oil from the measuring surfaces and the slide; contamination
of the surfaces may cause measurement error.
2. Allow the caliper and the object to be measured time to reach the same temperature; a
temperature difference between them may cause measurement error.

Theory:
The experiment to measure the length of different specimens using analogue and digital Vernier
calipers involves using both types of calipers to measure the length of various specimens. The
specimens can be of different shapes and sizes, such as rods, blocks, or cylinders. The purpose of
the experiment is to compare the accuracy and precision of the analogue and digital Vernier
calipers and to determine which type of caliper is more suitable for different applications.
To conduct the experiment, first, the analogue Vernier caliper is used to measure the length of each
specimen by aligning the jaws of the caliper with the edges of the specimen and reading the scale.
The same procedure is then repeated using the digital Vernier caliper. The readings obtained from
both types of calipers are recorded and compared.
The least count is the smallest distance the instrument can measure. It is the difference
between one main scale division and one vernier scale division.
The accuracy and precision of the measurements can be evaluated by calculating the mean,
standard deviation, and coefficient of variation for each set of readings. The results can be used to
determine the advantages and limitations of both types of calipers and to identify the applications
for which each type is most suitable.
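The mean, standard deviation, and coefficient of variation mentioned above can be computed with Python's standard library; the readings below are illustrative placeholders, not measured lab data:

```python
# Basic statistics for a set of repeated caliper readings.
# The readings are illustrative, not actual measurements.
import statistics

readings_mm = [25.12, 25.14, 25.10, 25.13, 25.11]

mean = statistics.mean(readings_mm)
stdev = statistics.stdev(readings_mm)   # sample standard deviation
cv_percent = 100 * stdev / mean         # coefficient of variation, in %

print(f"mean = {mean:.3f} mm, stdev = {stdev:.4f} mm, CV = {cv_percent:.3f} %")
```

A smaller standard deviation indicates better precision; the coefficient of variation lets sets of readings at different nominal sizes be compared on an equal footing.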
In conclusion, the experiment to measure the length of different specimens using analogue and
digital Vernier calipers is a useful exercise for understanding the principles of dimensional
measurement and the characteristics of different types of calipers. It can help students and
professionals in engineering, manufacturing, and other fields to select the appropriate tools for their
applications and to ensure the accuracy and quality of their work.

Length measurement

In Vernier Callipers, n divisions of the vernier scale coincide with (n-1) divisions of the main scale.

n V.S.D = (n-1) M.S.D

Formula Used:
Least count of Vernier Calliper: L.C. = 1 M.S.D – 1 V.S.D = (1/n) M.S.D

Corrected Length = Mean Observed Length – Zero Error
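The least-count and zero-error expressions above can be checked with a short sketch; all numeric values below are illustrative, not lab data:

```python
# Vernier caliper arithmetic: least count, observed reading, zero-error
# correction, and percentage error against a digital reading.
# All numeric values are illustrative, not actual lab data.

def least_count(msd_mm, n):
    """L.C. = 1 M.S.D - 1 V.S.D = M.S.D / n, since n V.S.D = (n - 1) M.S.D."""
    return msd_mm / n

def reading(main_scale_mm, vernier_div, lc_mm):
    """Reading = M.S. + (V.S. division x L.C.)."""
    return main_scale_mm + vernier_div * lc_mm

lc = least_count(1.0, 50)           # 1 mm divisions, 50 vernier divisions -> 0.02 mm
observed = reading(12.0, 7, lc)     # 12 mm + 7 x 0.02 mm = 12.14 mm
zero_error = -0.04                  # caliper reads -0.04 mm with jaws fully closed
corrected = observed - zero_error   # corrected = observed - zero error = 12.18 mm
digital = 12.16                     # illustrative digital-caliper reading
error_pct = abs(corrected - digital) / digital * 100
print(lc, round(observed, 2), round(corrected, 2), round(error_pct, 2))
```

Note the sign convention: a negative zero error is subtracted (i.e. its magnitude is added), a positive zero error is subtracted directly.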

Tables:

Sr.#   Main Scale Reading M.S. (mm)   Vernier Scale Reading V.S.R   Length = M.S. + (V.S.R × L.C.) (mm)   Digital Reading (mm)   Error %
1
2
3
4
5

Comments:

 When using the Vernier calipers, make sure to align the jaws with the edges of the specimen
and apply a gentle pressure to ensure a precise measurement.

 Repeat the measurements several times to increase the accuracy and reliability of the
results.

 Use statistical tools such as mean, standard deviation, and coefficient of variation to analyze
the data and compare the performance of the analogue and digital calipers.

Lab Session 03
Experiment 02
Objective:
To measure the diameter of different specimens using Analogue and Digital Vernier Caliper.

Apparatus:
 Digital Vernier Caliper.
 Analogue Vernier Caliper.
 Cylindrical Specimen
 Magnifying Glass
 Notebook and pen to record measurements

Diagram:

Procedure:
1. Bring the specimen close to the Vernier caliper jaws.
2. Hold it diameter-wise between the jaws of the caliper.
3. Observe the value shown on the screen of the caliper.
4. Note the reading from the screen.
5. Account for the zero error by adding it to or subtracting it from the reading.
6. Repeat steps 1 to 5 at least four or five times, changing the position of the jaws.
7. Apply the zero error to find the corrected diameter.
8. The digital Vernier caliper gives the reading directly on the screen.

Theory:
The experiment to measure the diameter of different specimens using analogue and digital Vernier
calipers involves using both types of calipers to measure the diameter of various cylindrical
specimens. The purpose of the experiment is to compare the accuracy and precision of the analogue
and digital Vernier calipers and to determine which type of caliper is more suitable for different
applications.
To conduct the experiment, the analogue Vernier caliper is first used to measure the diameter of
each specimen by aligning the jaws of the caliper with the specimen and reading the scale. The
same procedure is then repeated using the digital Vernier caliper, and the readings obtained from
both types of calipers are recorded and compared.
The least count is the smallest distance the instrument can measure. It is the difference between
one main scale division and one vernier scale division.
The accuracy and precision of the measurements can be evaluated by calculating the mean,
standard deviation, and coefficient of variation for each set of readings. The results can be used to
determine the advantages and limitations of both types of calipers and to identify the applications
for which each type is most suitable.
In conclusion, this experiment is a useful exercise for understanding the principles of dimensional
measurement and the characteristics of different types of calipers. It can help students and
professionals in engineering, manufacturing, and other fields to select the appropriate tools for their
applications and to ensure the accuracy and quality of their work.

Diameter measurement

In Vernier Callipers, n divisions of the vernier scale coincide with (n-1) divisions of the main scale.

n V.S.D = (n-1) M.S.D

Formula Used:
Least count of Vernier Calliper: L.C. = 1 M.S.D – 1 V.S.D = (1/n) M.S.D

Corrected Diameter = Mean Observed Diameter – Zero Error

Tables:

Sr.#   Main Scale Reading M.S. (mm)   Vernier Scale Reading V.S.R   Diameter = M.S. + (V.S.R × L.C.) (mm)   Digital Reading (mm)   Error %
1
2
3
4
5

Comments:

 Clean the surface of the specimen: Ensure that the surface of the cylindrical specimen is
clean and free from any dirt, oil, or debris. This will prevent errors in the measurement due
to obstruction or slipping of the caliper jaws.

 Align the jaws of the caliper: Open the jaws of the caliper and carefully align them with the
diameter of the cylindrical specimen. Make sure the jaws are perpendicular to the surface of
the specimen.

 Take multiple measurements: Take several measurements using both the analogue and
digital Vernier calipers to ensure the accuracy of the readings. Record the readings in a table
for comparison and analysis.

Lab Session 04
Experiment 03
Objective:
To attain the required length, width and thickness with the help of Block/Slip Gauges.

Apparatus:

 Magnifying Glass
 Notebook and pen to record measurements
 Block/Slip Gauges

Theory:
Slip Gauges are also known as Gauge Blocks. They are precise measuring instruments and are
universally accepted end standards of length. Slip gauges were first introduced by C. E. Johansson,
a Swedish engineer, which is why they are also called Johansson (or "Jo") gauges.
Construction of Slip Gauge / Gauge Blocks / Johnson gauges:
Slip gauges are rectangular blocks made of high-grade steel manufactured to very close tolerances.
The working faces of a slip gauge are made truly flat and parallel. The gauges undergo hardening
to resist wear and tear, and are then heated and cooled successively to relieve the stresses induced
during the hardening process. Slip gauges can also be made of tungsten carbide, which is extremely
hard and wear-resistant. The size of each slip gauge is permanently marked on one of its measuring
faces.
Making up a Dimension with Slip Gauges:
Slip gauges come in several sizes, which are marked on their measuring faces (see the slip gauge box).

Slip/Block Gauges

Wringing of Slip Gauges:


 a. Before using slip gauges, clean their faces.
 b. Slide one slip gauge over the other at 90° (as shown in the figure, position 2) with light
pressure. This expels the air between the gauge faces.
 c. Once one gauge has been placed at 90° to the other with light pressure, rotate it clockwise
to bring the two gauges in line, as shown in the figure (fourth position).

Figure: Wringing Process

 d. Wringing makes it possible to achieve a dimension as the sum of the individual slip gauge
sizes; the need for clamping is also avoided.
 e. To make up any dimension, a set of slip gauges is wrung together to achieve the required
size. See the following example.
Uses of Slip Gauge
 Setting up a comparator to a specific dimension
 Direct Precise measuring purpose.
 To inspect the Vernier Caliper, Micrometers and some other linear measuring instruments.
 Used in conjunction with a sine bar to measure the angle of a workpiece.
 Used to Check the distance between the parallel faces.

Procedure:
1. Select the slip gauge blocks needed to build the required measurement standard.
2. Choose blocks of the correct size and thickness and stack them together in the correct order.
3. Use a slip gauge holder or clamp to hold the slip gauge blocks securely in place on the surface
plate.
4. Gently wring each gauge into place, making sure it is in full contact with the adjacent block,
and adjust the combination of slip gauge blocks as necessary to achieve the desired measurement.
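The block-selection step above is usually done by eliminating the smallest decimal place first. A minimal sketch of that method, assuming a simplified metric set (the block sizes listed are an assumption, not the exact contents of the lab's box):

```python
# Greedy selection of slip gauges to build a target dimension, working from
# the smallest decimal place upward (the usual textbook method). The block
# sizes below are an assumed, simplified subset of a metric gauge set.

def select_gauges(target_mm):
    stack = []
    remaining = round(target_mm, 3)
    # 1) eliminate the third decimal place with a 1.00x block
    thousandths = round(remaining * 1000) % 10
    if thousandths:
        block = 1.0 + thousandths / 1000
        stack.append(block)
        remaining = round(remaining - block, 3)
    # 2) eliminate the remaining fractional part with a 1.xx block
    frac = round(remaining % 1, 3)
    if frac:
        block = 1.0 + frac
        stack.append(block)
        remaining = round(remaining - block, 3)
    # 3) make up the rest with whole-millimetre blocks, largest first
    for block in (100.0, 50.0, 20.0, 10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0):
        while remaining >= block:
            stack.append(block)
            remaining = round(remaining - block, 3)
    return stack

print(select_gauges(41.125))  # e.g. [1.005, 1.12, 20.0, 10.0, 9.0]
```

Each selected block is then wrung onto the stack as described above; the fewer the blocks, the smaller the accumulated wringing error.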

Tables:

Sr.#   Required Dimension (mm)   G1 (mm)   G2 (mm)   G3 (mm)   G4 (mm)   G5 (mm)
1
2
Comments:
Slip gauges are the accurate thickness gauges that can be used to calibrate the measuring
instruments. The measurements of slip gauges were taken through Vernier micrometer and were
found to be very precise and accurate depending upon its least count. There were small errors in the
measurements. Possible errors in the measurements could arise due to several reasons:

 Parallax error
 Environmental factors, such as weather conditions and temperature variations
 Thermal expansion
 Dust particles and oil between the surfaces

Lab Session 05
Experiment 04
Objective:
To measure the diameter of different samples of the wire using a wire gauge

Apparatus:

 Magnifying Glass
 Notebook and pen to record measurements
 Wire gauge
 Wire samples

Procedure:
1. Obtain the wire samples: Collect wire samples of different diameters to be measured using a
wire gauge.
2. Clean the wire samples: Ensure that the wire samples are clean and free from any dirt, oil,
or debris. This will prevent errors in the measurement due to obstruction or slipping of the
wire gauge.
3. Cut the wire samples: Use a wire cutter or scissors to cut the wire samples to a length of
about 10 cm each.
4. Record the length of the wire samples: Use a ruler or measuring tape to measure and record
the length of each wire sample in a table or spreadsheet.
5. Use the wire gauge: Hold the wire sample against the wire gauge and carefully insert it into
the smallest notch or hole that the wire can fit into. The notch or hole should be snug but not
too tight.
6. Read the measurement: Look at the number on the wire gauge that corresponds to the notch
or hole into which the wire has been inserted. This number indicates the diameter of the
wire sample.
7. Repeat the process: Repeat the process for each wire sample to obtain multiple
measurements. Record the readings in a table for comparison and analysis.
8. Calculate the average diameter: Calculate the average diameter of each wire sample by
adding all the measurements and dividing the sum by the number of measurements taken.
This will provide a more precise measurement of the diameter.
9. Check for consistency: Check for consistency between the measurements obtained for each
wire sample. If there is a significant difference between the readings, recheck the insertion
of the wire into the gauge and take additional measurements.
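Steps 8 and 9 above (averaging and checking consistency) can be sketched as follows; the readings and the 0.05 mm spread limit are illustrative assumptions, not lab values:

```python
# Average repeated gauge readings for one wire and flag inconsistent sets.
# The sample readings and the 0.05 mm spread limit are illustrative.

def average_diameter(readings_mm):
    return sum(readings_mm) / len(readings_mm)

def is_consistent(readings_mm, max_spread_mm=0.05):
    """True if the spread (max - min) of the readings is within the limit."""
    return (max(readings_mm) - min(readings_mm)) <= max_spread_mm

sample = [1.22, 1.24, 1.23, 1.23]
print(round(average_diameter(sample), 2))  # 1.23
print(is_consistent(sample))               # True
```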

Theory:
The diameter of a wire is an important parameter that determines its strength, electrical
conductivity, and other properties. Wire gauges are commonly used to measure the diameter of
wires, as they provide a simple and accurate way to determine the thickness of wires.
Wire gauges are typically made of metal or plastic and consist of a series of notches or holes of
various sizes, each labeled with a number that corresponds to a specific diameter range. To measure
the diameter of a wire using a wire gauge, the wire is inserted into the smallest notch or hole that it
can fit into. The number on the wire gauge that corresponds to the notch or hole into which the wire
has been inserted indicates the diameter of the wire sample.
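If the plate follows the American Wire Gauge (AWG) standard — an assumption, since some lab gauges use SWG, which is tabulated rather than formula-based — the gauge number maps to diameter by a fixed formula, which also shows why smaller numbers mean larger wires:

```python
# Convert an American Wire Gauge (AWG) number to wire diameter in mm.
# Assumes the gauge follows the AWG standard (ASTM B258); an SWG plate
# uses a different, tabulated scale.

def awg_to_diameter_mm(n):
    # AWG 36 is defined as 0.005 in (0.127 mm); each step scales by 92**(1/39)
    return 0.127 * 92 ** ((36 - n) / 39)

print(round(awg_to_diameter_mm(10), 3))  # 2.588 mm
print(round(awg_to_diameter_mm(20), 3))  # 0.812 mm
```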

How to use a wire gauge

In this experiment, different samples of wire with varying diameters are measured using a wire
gauge. The wire gauge is selected based on the diameter range of the wire samples being measured.
The wire samples are first cleaned to ensure that they are free from any dirt, oil, or debris that could
affect the accuracy of the measurements. The wire is then cut to the appropriate length using a wire
cutter or scissors, and the length is measured using a ruler or measuring tape.
The wire is inserted into the smallest notch or hole on the wire gauge that it can fit into, and the
number on the wire gauge that corresponds to the notch or hole is recorded as the diameter of the
wire sample. Multiple measurements are taken for each wire sample to ensure the accuracy and
consistency of the readings. The readings obtained from the wire gauge are recorded in a table or
spreadsheet, and calculations such as averages and standard deviations are performed to analyze the
data.

wire gauge

Overall, this experiment provides a simple and effective way to measure the diameter of different
samples of wire using a wire gauge. It is important to ensure that all apparatus are clean and that
multiple measurements are taken to ensure the accuracy of the readings.

Table:
Sr.#   Specimen   Gauge Reading   Reading (mm)   Reading (inch)

1      Wire 1

2      Wire 2

3      Wire 3

4      Wire 4

Comments:

 Clean the wire: Ensure that the wire samples are clean and free from any dirt, oil, or debris.
This will prevent errors in the measurement due to obstruction or slipping of the wire gauge.

 Insert the wire into the gauge properly: Hold the wire sample against the wire gauge and
carefully insert it into the smallest notch or hole that the wire can fit into. The notch or hole
should be snug but not too tight.

 Take multiple readings: Take multiple readings for each wire sample to ensure the accuracy
and consistency of the measurements. Recording the readings in a table or spreadsheet will
help with data analysis.

 Check for consistency: Check for consistency between the measurements obtained for each
wire sample. If there is a significant difference between the readings, recheck the insertion
of the wire into the gauge and take additional measurements.

Lab Session 06
Experiment 05
Objective:
To measure the pitch of different bolts using a thread pitch (leaf) gauge.

Apparatus:
1. Thread pitch (leaf) gauge
2. Different bolts with varying thread pitches
3. Cleaning cloth or brush for cleaning bolts and leaf gauge

Procedure:
1. Select the appropriate leaf gauge: Choose a leaf gauge that matches the range of thread
pitches of the bolts being measured. Leaf gauges typically consist of a series of leaves of
various thicknesses, each labeled with a range of thread pitches.
2. Clean the bolt: Ensure that the bolt is clean and free from any dirt, oil, or debris that could
affect the accuracy of the measurements.
3. Align the gauge: Hold the leaf gauge against the bolt, ensuring that the gauge is aligned
with the threads.
4. Insert the gauge: Insert the thinnest leaf that fits between two adjacent threads. If the leaf
does not fit, try the next size up until a leaf that fits snugly is found.
5. Record the reading: Note the label or number on the leaf gauge that corresponds to the
thickness of the leaf used to measure the thread pitch. This number indicates the thread
pitch of the bolt.
6. Repeat the process: Repeat the process for each bolt to be measured, taking care to clean the
bolt and leaf gauge between measurements.
7. Analyze the data: Record the readings obtained in a table or spreadsheet, and calculate the
average and standard deviation of the thread pitches.

How to use a leaf gauge

Theory:
The pitch of a bolt is defined as the distance between two adjacent threads. It is an important
parameter to determine the strength and performance of a bolt. Leaf gauges are commonly used to
measure the pitch of bolts, which provides a simple and accurate way to determine the distance
between threads.

Thread pitch (leaf) gauge

Leaf gauges consist of a series of leaves of various thicknesses, each labeled with a range of thread
pitches. To measure the pitch of different bolts using a leaf gauge, the thinnest leaf that fits snugly
between two adjacent threads is inserted into the gauge, and the label or number on the leaf gauge

corresponding to the thickness of the leaf is noted. This number indicates the thread pitch of the
bolt. Multiple measurements are taken to ensure the accuracy and consistency of the readings.

By measuring the pitch of different bolts using a leaf gauge, it is possible to determine the thread
pitch of bolts accurately, which is essential for selecting the appropriate bolts for various
applications. It is important to ensure that the bolt and leaf gauge are clean, and multiple
measurements are taken to ensure the accuracy and consistency of the readings.
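Leaf gauges are commonly labelled either in millimetres of pitch or in threads per inch (TPI); the two are related by pitch = 25.4 / TPI, as in this sketch:

```python
# Conversion between metric thread pitch (mm between adjacent threads)
# and imperial threads per inch (TPI): 1 inch = 25.4 mm exactly.

def tpi_to_pitch_mm(tpi):
    return 25.4 / tpi

def pitch_mm_to_tpi(pitch_mm):
    return 25.4 / pitch_mm

print(round(tpi_to_pitch_mm(20), 2))   # 20 TPI -> 1.27 mm pitch
print(round(pitch_mm_to_tpi(1.5), 2))  # 1.5 mm pitch -> 16.93 TPI
```

This is useful when a leaf-gauge reading in one system must be compared against a bolt specified in the other.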

Table:

Sr.#   Pitch Dimensions (mm)   Standard Size (mm)

Comments:

 Clean the bolt and leaf gauge thoroughly to ensure accurate readings. Use a cleaning cloth
or brush to remove any dirt, oil, or debris from the surfaces.
 Hold the leaf gauge against the bolt, ensuring that the gauge is aligned with the threads.
 Insert the thinnest leaf that fits snugly between two adjacent threads.
 Note the label or number on the leaf gauge corresponding to the thickness of the leaf used to
measure the thread pitch.
 Repeat the process for each bolt to be measured, taking care to clean the bolt and leaf gauge
between measurements.
 Take multiple measurements for each bolt to ensure the accuracy and consistency of the
readings.

Lab Session 07
Experiment 06
Objective:
To measure the diameter of different specimens with the help of inside and outside calipers.

Apparatus:
 Specimens to measure
 Outside calipers
 Inside calipers
 A meter ruler

 Paper and pen to record measurements

Procedure:
1. Gather all materials and make sure they are clean and dry.
2. Select the appropriate caliper (inside or outside) depending on the specimen to be measured.
The inside caliper is used to measure the inside diameter of a specimen, while the outside
caliper is used to measure the outside diameter.
3. Open the caliper jaws so that they are wider than the diameter of the specimen.
4. Place the caliper jaws onto the specimen so that they are perpendicular to the axis of the
specimen and on opposite sides.
5. Adjust the caliper jaws so that they are snug against the specimen, but not so tight that
they deform the specimen.
6. Look at the caliper scale and record the measurement to the nearest hundredth of a
millimeter.
7. Repeat the process for each specimen, using the appropriate caliper for each one.

Outside & Inside Calipers

Theory:
Inside and outside calipers are a type of measuring instrument used to measure the dimensions of
objects with high precision. They are commonly used in engineering, metalworking, woodworking,
and other precision industries.

Inside calipers consist of two legs with one leg having a pointed end and the other having a flat end.
The pointed end is inserted into the object to be measured, and the flat end is adjusted until it is
snug against the opposite side of the object. The measurement can then be read from the scale on
the instrument.

Inside calipers

Outside calipers, on the other hand, have two legs with both ends being pointed. The legs are
placed on either side of the object to be measured, and the distance between the points is adjusted
until they are snug against the outer surface. The measurement can then be read from the scale on
the instrument.

Outside Caliper

The precision of inside and outside calipers is determined by the distance between the points of the
legs and the accuracy of the scale. The points of the legs should be sharp and well-defined to ensure
an accurate measurement. The accuracy of the scale is usually determined by the number of
divisions on the scale and the quality of the instrument's manufacturing.

In addition to traditional manual calipers, there are also digital calipers that use electronic sensors
to measure the distance between the legs. Digital calipers provide faster and more accurate readings
and are often used in manufacturing and inspection processes.

Overall, inside and outside calipers are versatile and widely used measuring instruments that
provide precise measurements of an object's dimensions. They are essential tools for many
precision industries and are a crucial component of quality control processes.
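Because the results table records each diameter in both millimetres and inches, a short script can flag rows where the two units disagree. This is a sketch, not part of the original procedure; the 0.05 mm agreement threshold is an assumed value, not a standard.

```python
# Sketch: cross-check a length recorded in both mm and inches.
MM_PER_INCH = 25.4

def mm_to_inch(mm: float) -> float:
    """Convert millimetres to inches."""
    return mm / MM_PER_INCH

def readings_agree(mm: float, inch: float, tol_mm: float = 0.05) -> bool:
    """True if the mm and inch readings describe the same length."""
    return abs(mm - inch * MM_PER_INCH) <= tol_mm

print(mm_to_inch(25.4))           # 1.0
print(readings_agree(25.4, 1.0))  # True
```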
Table:

Sr. No. | Specimen   | Inside Dia. (mm) | Inside Dia. (in) | Outside Dia. (mm) | Outside Dia. (in)
1       | Specimen 1 | 7.3              | 2.92             | 9.4               | 3.75
2       | Specimen 2 | 2.3              | 1.91             | 10.5              | 4.2
3       | Specimen 3 | 1.1              | 0.45             | 3.8               | 1.5
4       | Specimen 4 | 7.4              | 2.90             | 9.1               | 3.56
5       | Specimen 5 | 1.2              | 0.40             | 1.8               | 0.75
6       | Specimen 6 | 7.6              | 4.15             | 10.6              | 3.0

Comments:

 It is important to ensure that the inside and outside calipers are properly calibrated before
taking measurements. This can be done by using a calibrated gauge block or reference
standard.

 The diameter of the specimen should be measured at several locations to ensure accuracy
and consistency of the measurements.

 The pressure applied to the legs of the calipers should be consistent across all measurements
to prevent errors due to variations in pressure.

 When measuring the inside diameter of a specimen, care should be taken to ensure that the
legs of the inside caliper are perpendicular to the axis of the specimen to avoid errors due to
oblique measurements.

 The outside caliper should be placed around the specimen at a distance from the end equal
to the length of the jaws to ensure that the measurement is taken at the widest point of the
specimen.

 The jaws of the calipers should be cleaned and inspected for damage before and after each
use to prevent errors due to wear or debris.

 When taking measurements, the observer should avoid parallax errors by positioning their
line of sight perpendicular to the scale of the caliper and avoiding any oblique viewing
angles

Lab Session 08
Experiment 07
Objective:
To measure the Clearance/Gap of different specimens with the help of a feeler gauge.

Apparatus:
 Feeler gauge set
 Specimens
 Worktable or workbench
 Cleaning cloth

Procedure:
1. Gather the different specimens to be measured and the feeler gauge set.
2. Identify the appropriate feeler gauge blade that corresponds to the expected clearance/gap of
the specimen.
3. Select the blade and insert it between the two surfaces of the specimen where the
clearance/gap is expected.
4. Gently slide the blade back and forth to ensure that it is not binding or catching on any
irregularities in the surface.
5. Once the blade can slide freely between the surfaces, remove it and measure its thickness
using a micrometer or vernier caliper.
6. Record the thickness of the blade as the clearance/gap measurement for that specific
specimen.
7. Repeat steps 3-6 for each specimen and for different expected clearances/gaps as needed.
8. Compare the measured clearances/gaps with the expected values and calculate the
percentage error if necessary.
9. Clean and store the feeler gauge set and other equipment.
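The percentage error asked for in step 8 can be computed directly. The clearance values in the example below are illustrative only, not calibrated data.

```python
def percentage_error(measured: float, expected: float) -> float:
    """Percentage error of a measured clearance against the expected value."""
    return abs(measured - expected) / expected * 100.0

# Illustrative: a 0.33 mm measured gap against an expected 0.30 mm.
print(round(percentage_error(0.33, 0.30), 1))  # 10.0
```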

Checking sparkplug gap with feeler gauge


feeler gauge

Theory:
A feeler gauge is a measuring tool used to measure the clearance or gap between two parallel
surfaces. It consists of a set of thin, flat metal blades of varying thicknesses that are stacked
together and held in a holder. Each blade is stamped with its thickness measurement in both metric
and imperial units.
Feeler gauges are commonly used in automotive and mechanical engineering applications, where
accurate measurements of small gaps or clearances are important for proper operation and
performance of the machinery or equipment. They are also used in quality control and inspection
processes to ensure that manufactured parts meet the required specifications.

adjust tappets with a feeler gauge

To use a feeler gauge, the appropriate blade is selected based on the expected clearance or gap size.
The blade is then inserted between the two surfaces being measured and gently slid back and forth
to ensure it is not binding or catching on any irregularities in the surface. Once the blade can slide
freely between the surfaces, it is removed, and the thickness of the blade is measured using a
micrometer or vernier caliper. The thickness of the blade is then recorded as the clearance or gap
measurement for that specific specimen.

Feeler Gauge uses

Feeler gauges can be made of different materials, including stainless steel, brass, or plastic. They
may be sold as individual blades or as a set, with a range of thicknesses from as thin as 0.02 mm to
as thick as 1.00 mm or more. Feeler gauges should be stored in a protective case to prevent damage
and corrosion, and should be handled with care to prevent bending or damage to the blades, which
can affect the accuracy of the measurement.
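Because blades can be stacked, a gap that matches no single blade is measured by combining blades, as the results table in this session does. The search for a matching stack can be sketched as follows; the blade set shown is hypothetical, since real sets vary by manufacturer.

```python
from itertools import combinations

# Hypothetical blade thicknesses in mm; real sets vary by manufacturer.
BLADES = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.40, 0.50]

def find_stack(target_mm: float, tol: float = 0.001, max_blades: int = 3):
    """Return a combination of distinct blades whose summed thickness
    matches the target gap within tol, or None if none exists."""
    for n in range(1, max_blades + 1):
        for combo in combinations(BLADES, n):
            if abs(sum(combo) - target_mm) <= tol:
                return combo
    return None

print(find_stack(0.35))  # (0.05, 0.3)
print(find_stack(0.63))  # None -- not reachable with this set
```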

Table:

Sr. No. | Specimen   | Gauge 1 (mm) | Gauge 2 (mm) | Gauge 3 (mm) | Actual Reading (mm)
1       | Specimen 1 | 0.28         | 0.05         | 0.01         | 0.33
2       | Specimen 2 | 0.28         | 0.20         | 0.15         | 0.63
3       | Specimen 3 | 0.28         | 0.45         | 0.10         | 0.65
4       | Specimen 4 | 0.12         | 0.28         | 0.006        | 0.306
5       | Specimen 5 | 0.15         | 0.28         | 0.50         | 0.93

Comments:
 The feeler gauge should be of high quality and calibrated to ensure accuracy of the
measurements.
 The correct blade is the one that fits the gap with a slight drag: it should slide between
the surfaces without being forced, while the next thicker blade should not fit.
 The blade should be inserted parallel to the surfaces being measured to prevent errors due to
oblique measurements.
 The blade should be handled carefully to prevent bending or damage, which can affect the
accuracy of the measurement.
 It is recommended to take multiple measurements at different locations and angles to ensure
consistency and accuracy of the results.
 The micrometer or vernier caliper used to measure the blade thickness should also be
calibrated and handled with care to prevent errors.
 The percentage error of the measurements should be calculated and reported to indicate the
level of accuracy of the experiment.

Lab Session 09
Experiment 08
Objective:
To measure the diameter of samples by using a plug gauge.

Apparatus:
 Plug gauge
 Samples
 Flat surface
 Cleaning supplies
 Lighting
 Ruler or Vernier caliper

Procedure:
1. Obtain the plug gauge: A plug gauge is a cylindrical tool with a specific diameter that is
used to measure the size of a hole or the diameter of a cylindrical object. Obtain a plug
gauge with a diameter that is slightly smaller than the diameter of the sample you want to
measure.
2. Clean the sample: Clean the sample to remove any dirt, debris, or other contaminants that
could affect the accuracy of the measurement.
3. Place the sample on a flat surface: Place the sample on a flat surface to ensure that it is
stable and level. This will help to ensure that the measurement is accurate.
4. Insert the plug gauge: Insert the plug gauge into the hole or onto the surface of the sample.
Apply light pressure to ensure that the gauge is fully inserted and that there is no gap
between the gauge and the sample.
5. Assess the fit: If the gauge slides into the hole with a snug, uniform fit, the diameter of
the hole is essentially equal to the diameter of the plug gauge. If the gauge passes through
with noticeable play, the hole is larger than the gauge; if the gauge will not enter at all,
the hole is smaller than the gauge.
6. Repeat the process: Repeat the process using plug gauges with different diameters until you
find the gauge that fits snugly into the hole or onto the surface of the sample.
7. Record the measurement: Record the diameter of the plug gauge that fits snugly into the
hole or onto the surface of the sample. This is the diameter of the sample.

Plug Gauge

Theory:
A plug gauge, also known as a pin gauge, is a precision measuring instrument used to measure the
diameter of a hole or the width of a slot. It is a cylindrical gauge made of tool steel or tungsten
carbide that has a specific diameter that is ground and lapped to very precise tolerances. The gauge
is designed to be inserted into the hole or slot to determine whether the dimension of the hole or
slot is within the acceptable range.
Plug gauges work on the go/no-go principle, meaning there are two types of gauges: go and
no-go gauges. A go gauge is made to the lower limit of the hole or slot and must enter it,
while a no-go gauge is made to the upper limit and must not enter. If the go gauge fits into
the hole or slot and the no-go gauge does not fit, then the hole or slot is within the
acceptable tolerance range.
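The go/no-go decision can be expressed as a short routine. This is a sketch; the limit values in the example are illustrative, not taken from a standard.

```python
def inspect_hole(diameter: float, go: float, no_go: float) -> str:
    """Classify a hole against go/no-go plug gauge limits.

    go    -- go gauge diameter (lower limit of the hole); must enter
    no_go -- no-go gauge diameter (upper limit); must not enter
    """
    go_enters = diameter >= go       # a plug enters when the hole is at least its size
    no_go_enters = diameter >= no_go
    if go_enters and not no_go_enters:
        return "within tolerance"
    if not go_enters:
        return "undersize"
    return "oversize"

# Illustrative limits for a nominal 5 mm hole:
print(inspect_hole(5.02, go=5.00, no_go=5.06))  # within tolerance
```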

Plug gauges are commonly used in manufacturing industries where precision and accuracy are
crucial, such as the automotive, aerospace, and medical device industries. They are essential for
ensuring that the products meet the required specifications and function properly. Additionally,
plug gauges require very little maintenance and have a long lifespan, making them a cost-effective
measuring instrument.

Limitations: Although plug gauges are very precise measuring instruments, they do have some
limitations. For example, they are only able to measure the diameter of a hole or slot, and they are
not suitable for measuring the thickness of a flat surface. Additionally, they are limited by their
tolerance range, which means that they may not be able to detect very small variations in the
diameter of the hole or slot.

Plug Gauges

Table:

Sample | Thread Size | Tolerance Class
1      | M5 × 0.8    | 6H
2      | M10 × 0.6   | 6H
3      | M12 × 0.7   | 7H
4      | M14 × 0.8   | 8H

Comments:
 Plug gauge selection: It is essential to select a plug gauge with a diameter that is slightly
smaller than the diameter of the sample to obtain accurate measurements.
 Material compatibility: The plug gauge material should be compatible with the sample
material to prevent any damage or deformation to the sample.
 Sample preparation: The sample should be cleaned and dried to ensure that it is free from
any debris, oil, or contaminants that could affect the measurement.
 Gauge insertion: The gauge should be inserted into the sample with gentle pressure to avoid
any deformation or damage to the sample.
 Gauge wear: Plug gauges may wear over time, resulting in inaccurate measurements. It is
necessary to inspect the gauge periodically and replace it when it shows signs of wear.
 Measurement error: The accuracy of the measurement may depend on the skill and
experience of the operator. Therefore, it is essential to train the operator properly to obtain
consistent and accurate measurements.
 Calibration: It is recommended to calibrate the plug gauge periodically to ensure that it
meets the required standards and provides accurate measurements. Calibration can be done
using a reference standard or a calibration laboratory.

Lab Session 10
Experiment 09
Objective:
To measure the diameter of different specimens using a ring gauge.

Apparatus:
 Ring gauges of various sizes
 Specimens to measure
 Flat surface
 Writing utensil

Procedure:

1. Collect the specimens that need to be measured and ensure they are clean and dry.

2. Choose a ring gauge whose bore is close to, but not smaller than, the diameter of the
specimen you want to measure; a gauge with a smaller bore will not pass over the specimen.

3. Place the specimen on a flat surface, such as a table or bench, to ensure that the
measurement is accurate.

4. Carefully slide the ring gauge over the widest part of the specimen until it is snug but not
too tight.

5. Look at the ring gauge and note the size of the ring that fits snugly around the specimen.
This is the diameter of the specimen.

6. Record the diameter measurement for the specimen in a table or spreadsheet.

7. Repeat steps 4-6 for each specimen that needs to be measured.

8. Analyze your results by comparing the diameter measurements of each specimen.
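The selection in the steps above amounts to finding the smallest gauge that still slides over the specimen. A sketch, assuming a hypothetical set of gauge bore diameters:

```python
# Hypothetical ring gauge bore diameters in mm.
RING_GAUGES = [3.0, 5.0, 8.0, 10.0, 12.0, 16.0, 20.0]

def snug_gauge(object_diameter: float):
    """Smallest gauge that still passes over the object, i.e. the
    closest (snug) fit; None if the object exceeds every gauge."""
    fitting = [g for g in sorted(RING_GAUGES) if g >= object_diameter]
    return fitting[0] if fitting else None

print(snug_gauge(9.4))   # 10.0
print(snug_gauge(25.0))  # None
```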

Ring Gauge
Theory:
Ring gauges are precision measuring tools used to measure the diameter of round objects. They are
commonly used in manufacturing, engineering, and metalworking to ensure that machined parts are
within tolerance and meet design specifications. A ring gauge consists of a circular metal ring with
a precisely machined internal diameter, and it is designed to be slightly smaller than the diameter of
the object being measured.

When a ring gauge is placed around an object, it should fit snugly without any play or
wobbling. If the ring gauge is loose on the object, the object is smaller than the gauge bore;
if the gauge cannot be slid onto the object, the object is larger than the gauge bore. By
using a range of different-sized ring gauges, it is possible to determine the precise diameter
of an object with a high degree of accuracy.

Ring gauges come in various shapes and sizes, depending on the application and the types of
objects being measured. They can be made from a variety of materials, including steel, tungsten
carbide, and ceramic. Steel ring gauges are the most common, and they are typically made from
high-quality tool steel that is hardened and ground to a precise tolerance.

Ring Gauge

Ring gauges can be used to measure the diameter of a wide range of round objects, including pipes,
shafts, pins, and bearings. They are commonly used in conjunction with other measuring tools, such
as micrometers and vernier calipers, to ensure that machined parts meet exacting specifications.
Ring gauges are precision measuring tools that are essential in the manufacturing and metalworking
industries. By using a range of different-sized ring gauges, it is possible to determine the precise
diameter of an object with a high degree of accuracy, ensuring that machined parts are within
tolerance and meet design specifications.

Table:

Sample | Thread Size | Tolerance Class
1      | M5 × 0.8    | 6G
2      | M7 × 0.8    | 6G
3      | M8 × 0.6    | 7G
4      | M10 × 0.7   | 8G
5      | M13 × 0.8   | 8G

Comments:

 Handle the specimens carefully to avoid damaging them or causing inaccuracies in your
measurements.
 Use a flat surface to ensure that the specimen is level and the measurement is accurate.
 Use a clean, dry ring gauge to avoid introducing any errors or contamination.
 Make sure the ring gauge is snug but not too tight, as this can affect the accuracy of the
measurement.
 If the diameter of the specimen varies along its length, measure the diameter at several
points and take the average.
 Repeat the measurement several times for each specimen to ensure accuracy and
consistency.

Lab Session 11
Experiment 10
Objective:
To measure the depth of different specimens using a Depth Micrometer.

Apparatus:

 Depth Micrometer
 Specimens of uniform thickness
 Calipers (for measuring the thickness of the specimens)
 Spreadsheet or data table (for recording measurements)
 Pen or pencil (for recording data)
 Ruler (for measuring the size of the specimens)

Procedure:

1. Gather the necessary equipment: You'll need a depth micrometer, the specimen(s) you want
to measure, and a clean, flat surface to work on.
2. Check the zero point: Before taking any measurements, make sure the depth micrometer is
properly zeroed. This means that the micrometer should read zero when the measuring faces
are brought into contact with each other. If the micrometer is not zeroed, adjust it until it
reads zero.
3. Place the specimen: Place the specimen on the flat surface with the area you want to
measure facing up.
4. Position the depth micrometer: Hold the depth micrometer perpendicular to the surface of
the specimen and position the measuring faces on top of the area you want to measure. Gently
press down on the micrometer until the measuring faces touch the surface of the specimen.
5. Read the measurement: Look at the scale on the depth micrometer to determine the depth
measurement. Some depth micrometers have digital displays that make the reading easier.
6. Record the measurement: Write down the depth measurement and any other relevant
information about the specimen, such as its name or identification number.
7. Repeat the process: If you need to measure other areas of the specimen or other specimens,
repeat steps 3-6 for each measurement.
8. Clean the micrometer: After you've finished measuring, clean the depth micrometer to
prevent any buildup of dirt or debris that could affect future measurements.

Depth Micrometer Diagram

Theory:
A depth micrometer is a precision measuring tool used to measure the depth of small holes, slots,
and other recesses that are difficult to measure with other instruments. It consists of a spindle, a
thimble, a measuring rod, and an anvil. The spindle is a long, thin rod that is inserted into the hole
or recess to be measured. The thimble is a rotating sleeve that surrounds the spindle and has
graduations marked on its surface. The measuring rod connects the spindle to the anvil, which is a
flat surface that rests on the material being measured.

Depth Micrometer

When the spindle is inserted into the hole or recess, the anvil is placed on the surface of the
material. The user then rotates the thimble until it comes into contact with the top of the measuring
rod. The distance between the anvil and the spindle can then be read off the graduations on the
thimble, providing an accurate measurement of the depth of the hole or recess.
Depth micrometers are available in a range of sizes and styles to suit different applications. Some
models have digital displays to provide easy-to-read measurements, while others use analog
displays with a rotating thimble. Some depth micrometers also have interchangeable measuring
rods to accommodate different depths and shapes of holes and recesses.

Reading the Metric Micrometer

In order to obtain accurate and reliable measurements with a depth micrometer, it is important to
ensure that the instrument is properly calibrated and that the measuring surfaces are clean and free
from debris. The spindle should be inserted straight into the hole or recess to avoid errors due to
angle or tilt, and the measuring force should be consistent for each measurement.
Depth micrometers are widely used in manufacturing, quality control, and research applications
where precise depth measurements are required. They are particularly useful for measuring the
depth of small holes, grooves, and slots in metal, plastic, and other materials where other measuring
tools may not be suitable.

Table:

Sr. No. | MSC (mm) | CSR       | Total Reading (mm)
1       | 18       | 11 × 0.01 | 18.11
2       | 23       | 20 × 0.01 | 23.20
3       | 7        | 24 × 0.01 | 7.24
4       | 23       | 44 × 0.01 | 23.44
5       | 15       | 35 × 0.01 | 15.35
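Each total in the table is the main scale reading (MSC) plus the circular scale reading (CSR) multiplied by the 0.01 mm least count, which can be checked in a few lines:

```python
def micrometer_reading(msr_mm: float, csr_divisions: int,
                       least_count: float = 0.01) -> float:
    """Total depth = main scale reading + circular scale divisions x least count."""
    return round(msr_mm + csr_divisions * least_count, 2)

# Rows of the table above:
for msr, csr in [(18, 11), (23, 20), (7, 24), (23, 44), (15, 35)]:
    print(micrometer_reading(msr, csr))
```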

Comments:

 Ensure proper calibration of the Depth Micrometer before starting the experiment to ensure
accurate measurement of the depth of the specimens.
 Select specimens of uniform thickness to obtain reliable measurements. Careful selection
and preparation of specimens can prevent errors and variations in the data.
 Apply the Depth Micrometer perpendicular to the surface of the specimen to avoid errors
due to angle or tilt. Also, apply it with a consistent amount of force to ensure uniformity in
measurements.
 Repeat measurements several times and take an average of the values obtained to ensure
accuracy.
 Record all measurements in a clear and organized manner using a spreadsheet or data table.
Include specimen details such as specimen ID, date, and experimenter name.
 Perform statistical analysis to determine the mean, standard deviation, and other relevant
statistical parameters of the data. This can help identify any outliers or inconsistencies in the
data.
 Report the results of the experiment clearly and concisely, along with any relevant tables,
graphs, or charts. Include any limitations or assumptions made during the experiment.

Lab Session 12
Experiment 11
Objective:
To measure the depth of different specimens using a Vernier Depth Gauge.

Apparatus:

 Vernier Depth Gauge


 Specimens of different depths
 Ruler
 Pen or pencil
 Spreadsheet or data table (optional)

Procedure:

1. Ensure the Vernier Depth Gauge is properly calibrated before use. Follow the
manufacturer's instructions for calibration or consult a calibration guide.
2. Select the first specimen to be measured and place it on a flat surface.
3. Adjust the Vernier Depth Gauge so that the jaws are wider than the specimen.
4. Hold the Vernier Depth Gauge with your thumb and forefinger and lower the jaws onto the
specimen.
5. Gently close the jaws until they touch the bottom of the specimen. Apply consistent
pressure to the Vernier Depth Gauge to avoid variations in the readings.
6. Read the Vernier Depth Gauge scale by aligning the zero mark on the Vernier scale with the
corresponding mark on the main scale. The depth is indicated by the number on the Vernier
scale that lines up with the main scale reading.
7. Record the measurement in a data table or spreadsheet. Note the specimen ID, date,
experimenter name, and the measured depth. You can also record additional information
such as the thickness of the specimen and any observations or notes.
8. Repeat steps 2-7 for each specimen to be measured.
9. Take multiple measurements for each specimen to ensure accuracy and precision. Record
all measurements and calculate the mean and standard deviation.
10. Analyze the data using statistical analysis software, if available, to determine any trends or
significant differences between the specimens.
11. Report the results of the experiment, including the mean and standard deviation of the
measured depths, any observations or notes, and any statistical analysis performed.
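The mean and standard deviation called for in steps 9-11 can be computed with the standard library. The readings below are illustrative, not measured data.

```python
from statistics import mean, stdev

# Illustrative repeated depth readings (mm) for one specimen.
readings = [7.08, 7.10, 7.06, 7.08, 7.09]

print(f"mean = {mean(readings):.3f} mm")      # mean = 7.082 mm
print(f"std dev = {stdev(readings):.3f} mm")  # sample standard deviation
```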

Vernier Depth Gauge

Theory:
The Vernier Depth Gauge is a precision measuring instrument used to measure the depth of holes,
slots, and other recesses in objects. The gauge consists of a fixed main scale and a sliding Vernier
scale, which moves along the main scale to provide accurate and precise depth measurements.
The main scale of the Vernier Depth Gauge is typically graduated in millimeters or inches, and is
marked with divisions representing one unit of measurement. The Vernier scale is also graduated in
the same units as the main scale, but has a finer resolution, typically one-tenth or one-fiftieth of a
unit. The Vernier scale is attached to a sliding jaw that can be adjusted to fit the recess or depth
being measured.

The least count of a Vernier Depth Gauge is determined by the resolution of the Vernier scale,
which is the smallest increment that can be measured with the gauge. The least count is calculated
by dividing the smallest graduation on the main scale by the number of divisions on the Vernier
scale.
For example, if the main scale is graduated in millimetres with divisions of 1 mm, and the
Vernier scale has 10 divisions that span the same distance as 9 divisions on the main scale,
then one Vernier division equals 0.9 mm and the least count is:
Least count = 1 MSD − 1 VSD = 1 mm − 0.9 mm = 0.1 mm
This means that the Vernier Depth Gauge can measure depths with a resolution of 0.1 mm.
To take a measurement using a Vernier Depth Gauge, the sliding jaw is placed in contact with the
bottom of the recess or hole being measured, and the Vernier scale is moved until its zero point
lines up with the nearest graduation on the main scale. The reading is then taken from the Vernier
scale, where the line on the Vernier scale that lines up with a line on the main scale provides the
fractional part of the measurement. The total measurement is obtained by adding the main scale
reading to the fractional part indicated by the Vernier scale.
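The least-count rule can be evaluated directly. The second call below corresponds to a 50-division Vernier, a common fine-resolution configuration; the specific values are illustrative.

```python
def least_count(main_div_mm: float, vernier_divisions: int) -> float:
    """Least count = smallest main scale division / number of Vernier divisions."""
    return main_div_mm / vernier_divisions

print(least_count(1.0, 10))  # 0.1 -- the 10-division example in the text
print(least_count(1.0, 50))  # 0.02 -- a common fine-resolution gauge
```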

A Digital VDG in Laboratory

The Vernier Depth Gauge is a precision measuring instrument used to measure the depth of
recesses and holes in objects. The least count of the gauge is determined by the resolution of the
Vernier scale, and the accuracy and precision of the measurement depend on the skill and technique
of the operator, as well as the quality and condition of the gauge.

Table:

Sr. No. | MSC (mm) | VSR       | Total Reading (mm)
1       | 7        | 4 × 0.02  | 7.08
2       | 19       | 2 × 0.02  | 19.04
3       | 41       | 6 × 0.02  | 41.12
4       | 61       | 8 × 0.02  | 61.16
5       | 150      | 17 × 0.02 | 150.34

Comments:

 Understand the principles of operation of the Vernier Depth Gauge and how to interpret its
readings. Familiarize yourself with the main scale, Vernier scale, and resolution of the
gauge.
 Use the appropriate units of measurement for the depth of the specimens. Common units
include millimeters (mm) and inches (in).
 When taking measurements, make sure the Vernier Depth Gauge is perpendicular to the
surface of the specimen to avoid errors due to angle or tilt.
 Avoid over-tightening the jaws of the Vernier Depth Gauge when taking measurements.
This can cause damage to the gauge or the specimen.
 Avoid any contaminants or debris that may affect the accuracy of the measurements. Clean
the Vernier Depth Gauge and specimens before use if necessary.
 If possible, use a digital Vernier Depth Gauge for increased precision and accuracy. Digital
gauges typically have a higher resolution and can provide measurements in different units.
 Follow any additional instructions or precautions provided by the manufacturer for the
specific Vernier Depth Gauge model being used.

Lab Session 13
Experiment 12
Objective:
To measure the angle of different specimens by using sine bar and surface-plate apparatus.

Apparatus:

 Sine bar
 Surface plate
 Angle gauge block
 Vernier height gauge
 Specimens with flat surfaces
 Cleaning cloth

Procedure:

1. Clean the surface plate using a cleaning cloth to remove any dust or debris.
2. Place the surface plate on a level surface.
3. Place the sine bar on the surface plate with the two precision ground cylinders facing up.
4. Adjust the height of the sine bar using the Vernier height gauge to ensure that it is level.
5. Place the angle gauge block on the precision ground cylinders of the sine bar.
6. Adjust the position of the angle gauge block until it is perpendicular to the sine bar.
7. Place the specimen to be measured on the angle gauge block, with the flat surface in contact
with the block.
8. Adjust the position of the specimen until it is resting against the sine bar.
9. Measure the height of the specimen using the Vernier height gauge, making sure that the
gauge is perpendicular to the surface of the specimen.
10. Calculate the angle of the specimen using the sine relation:
angle θ = arcsin(h / L)
where h is the height difference between the two precision ground cylinders (obtained from the
gauge-block stack or the height measured in step 9), and L is the known centre distance
between the two cylinders of the sine bar.
11. Repeat steps 7-10 for each specimen to be measured.
Note: The accuracy of the measurement depends on the precision of the sine bar and surface plate,
as well as the skill and technique of the operator.

Theory:
The sine bar and surface plate apparatus is used to measure the angle of a specimen with a high
degree of accuracy. The apparatus consists of a sine bar, which is a precision ground bar with two
accurately machined cylindrical surfaces that are perpendicular to the axis of the bar, and a surface
plate, which is a flat, level plate used as a reference surface.
The sine bar works on the principle of trigonometry: when one cylinder is raised by a
gauge-block stack of height h, the bar inclines at an angle θ that satisfies
sin θ = h / L
so the angle is given by θ = arcsin(h / L), where L is the centre distance between the two
precision ground cylinders.
The accuracy of the measurement depends on the precision of the sine bar and surface plate,
and on the resolution of the height measurement: a height change dh produces an angle change
of approximately dh / (L cos θ). For example, with a centre distance of 100 mm and a height
gauge reading to 0.01 mm, the smallest resolvable angle near 0° is
arcsin(0.01 / 100) ≈ 0.006°, or roughly 20 arcseconds.
The surface plate is used as a reference surface for the sine bar and should be flat, level, and free of
any dirt or debris that could affect the measurements. The least count of the Vernier height gauge
used to measure the height of the specimen also affects the accuracy of the measurement. The
Vernier height gauge is typically accurate to 0.01 mm, which is the smallest height that can be
measured with the instrument.
To increase the accuracy and precision of the measurements, it is important to take multiple
measurements for each specimen and average the results. It is also important to record all
measurements, calculations, and relevant information, such as the type of specimen and the position
of the measurement, to ensure reproducibility and accuracy of the experiment.
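A sine bar is so named because it realizes sin θ = h/L, where h is the gauge-block height and L the centre distance between the cylinders. A minimal sketch of the conversion; the 100 mm default matches the centre distance used in the example above, not every sine bar:

```python
import math

def sine_bar_angle(height_mm: float, centre_distance_mm: float = 100.0) -> float:
    """Inclination (degrees) of a sine bar whose cylinder is raised by
    height_mm over centre distance L: sin(theta) = h / L."""
    return math.degrees(math.asin(height_mm / centre_distance_mm))

print(round(sine_bar_angle(50.0), 2))  # 30.0, since sin(30 deg) = 0.5
```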

Table:

Sr. No. | Specimen | Angle (θ)

Comments:

 Make sure that the sine bar and surface plate are made of high-quality materials and have a
high degree of flatness to ensure accuracy in the measurements.
 Before taking measurements, ensure that the surface plate is clean and free of any dirt or
debris that could interfere with the measurements.
 Use a precision height gauge with an accuracy of at least 0.01 mm to measure the height of
the specimen.
 Ensure that the height gauge is properly calibrated before taking measurements.
 Take multiple measurements for each specimen and average the results to increase the
precision and accuracy of the measurements.
 Use a precision angle gauge block to ensure that the angle of the specimen is perpendicular
to the sine bar.
 Record all measurements, calculations, and relevant information, such as the type of
specimen and the position of the measurement, to ensure reproducibility and accuracy of the
experiment.

Lab Session 14
Experiment 13
Objective:
To measure the height of different specimens using a Vernier Sine Gauge.

Apparatus:

 Vernier Sine Gauge


 Specimens of different heights
 Surface plate

Procedure:

1. Clean the surface plate and the Vernier Sine Gauge with a soft cloth to ensure they are free
of any debris or dirt.

2. Place the surface plate on a level surface.

3. Select a specimen and place it on the surface plate.

4. Place the Vernier Sine Gauge on the surface plate and adjust the height of the gauge so that
the base of the gauge is in contact with the surface plate and the measuring pin is in contact
with the top of the specimen.

5. Adjust the angle of the gauge to the desired angle of measurement using the protractor on
the gauge.

6. Read the height measurement from the Vernier scale and record the value.

7. Repeat the measurement for the same specimen several times to ensure accuracy and
calculate the average value.

8. Repeat the measurement for each specimen and record the values.

Theory:
A Vernier Sine Gauge is a precision instrument used to measure the height of an object with a high
degree of accuracy. The instrument works on the principle of trigonometry, where the height of the
object is calculated using the sine of the angle of inclination.
The Vernier Sine Gauge consists of a base plate, a measuring pin, a Vernier scale, and a protractor.
The base plate is placed on a flat surface, and the object whose height is to be measured is placed
under the measuring pin. The protractor is used to adjust the angle of inclination of the measuring
pin.
The measurement is taken by reading the Vernier scale, which consists of a sliding scale that is
moved relative to the main scale. The Vernier scale has a smaller division than the main scale,
which allows for more precise measurements.
The least count of the Vernier Sine Gauge is the smallest distance that can be measured with
the instrument; it equals the smallest main-scale division divided by the number of divisions
on the Vernier scale. For example, if the main scale is graduated in 1 mm divisions and the
Vernier scale carries 50 divisions, the least count is given by:
Least Count = 1 mm / 50 = 0.02 mm
To increase the accuracy of the measurement, it is important to ensure that the base plate is placed
on a flat surface and that the measuring pin is in contact with the object being measured at a single
point. The angle of inclination should also be adjusted carefully using the protractor to ensure that
the measurement is taken at the correct angle.
It is also important to repeat the measurement several times and calculate the average value to
increase the precision of the measurement. All measurements should be recorded, including the
type of object being measured and the position of the measurement, to ensure reproducibility and
accuracy of the experiment.
In summary, the Vernier Sine Gauge is a precision instrument used to measure the height of an
object based on the angle of inclination. The least count of the instrument is determined by the
smallest division on the Vernier scale, and to ensure accuracy and precision, it is important to
carefully adjust the angle of inclination and repeat the measurement several times.

Table:

Sr. No.   Specimen   Height (mm)          Height (inch)
1                    4.5 + 46 × 0.02      1.75 + 16 × 0.01
2                    5.7 + 7 × 0.02       2.25 + 47 × 0.01
3                    15.2 + 28 × 0.02     6 + 5 × 0.01
4                    15.05 + 19 × 0.02    5.9 + 20 × 0.01
5                    3.7 + 49 × 0.02      1.45 + 7 × 0.01
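The tabulated readings above can be evaluated to decimal values with a short script (a sketch; the 0.02 mm least count is taken from the readings as written):

```python
# Sketch: evaluating the tabulated vernier readings
# (main reading + coinciding division * least count) into decimal heights.
# Readings are copied from the table above; least count is 0.02 mm.

readings_mm = [(4.5, 46), (5.7, 7), (15.2, 28), (15.05, 19), (3.7, 49)]
LC_MM = 0.02

heights_mm = [round(main + div * LC_MM, 2) for main, div in readings_mm]
print(heights_mm)  # [5.42, 5.84, 15.76, 15.43, 4.68]
```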

Comments:

 Use a Vernier Sine Gauge of high accuracy and precision to ensure reliable
measurements.
 Make sure the surface plate is clean and free of any dirt or debris that could affect the
measurements.
 Adjust the Vernier Sine Gauge so that the base of the gauge is in contact with the surface
plate and the measuring pin is in contact with the top of the specimen to ensure accurate
measurements.
 Repeat measurements for each specimen multiple times and calculate the average to
increase the precision of the measurements.
 Use a high-precision Vernier caliper or micrometer to measure the thickness of the
specimens before the experiment to ensure accurate measurements.
 Record all measurements, including the type of specimen and the position of the
measurement, to ensure reproducibility and accuracy of the experiment.
 Make sure the Vernier Sine Gauge is properly calibrated before conducting the experiment.

Lab Session 15
Experiment 14
Objective:
To measure the length and diameter of different specimens using a Digital Micrometer.

Apparatus:

 Digital Micrometer
 Specimens of varying lengths and diameters

Procedure:

1. Turn on the Digital Micrometer and allow the display to initialize. With no object between
the measuring faces, zero the instrument and check that the display shows "0.000" or "0.00"
(depending on the resolution of the Micrometer).
2. Select the desired units of measurement (millimeters or inches) by pressing the appropriate
button on the Micrometer.
3. Place the specimen to be measured between the measuring faces of the Micrometer. For
length measurements, position the specimen with one end touching the fixed jaw and the
other end touching the moving jaw of the Micrometer. For diameter measurements, position
the specimen so that the jaws of the Micrometer clamp down on the sides of the object.
4. Use the thimble to close the measuring faces onto the specimen until firm contact is made,
being careful not to apply too much pressure. The thimble is the small rotating wheel on the
Micrometer that moves the measuring faces closer together; turn it with your thumb and
index finger, applying only light pressure. If the Micrometer has a ratchet stop, use it for
the final turns so that the contact force is consistent between measurements.
5. Once the measuring faces are in contact with the specimen, take note of the reading on the
digital display of the Micrometer. The reading will show the measurement of the specimen
in the units of measurement you selected earlier, displayed to the instrument's full
resolution (e.g. 0.250 mm).
6. Record the measurement in a table or notebook. Include the units of measurement and any
significant digits (e.g. 10.5 mm).
7. Repeat steps 3-6 for each specimen, making sure to reset the Micrometer to zero before
measuring each specimen.
8. After measuring all the specimens, turn off the Digital Micrometer and clean the measuring
faces with a soft cloth or tissue to remove any debris or dust.
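Steps 5-7 (recording repeated readings and averaging them) can be sketched as follows; the specimen names and readings are made-up examples, not measured data:

```python
import statistics

# Sketch: recording repeated micrometer readings per specimen and
# averaging them. Specimen names and readings (in mm) are made-up
# examples, not measured data.

readings = {
    "rod A": [10.512, 10.508, 10.510],
    "rod B": [7.996, 8.001, 8.000],
}

for name, values in readings.items():
    mean = statistics.mean(values)
    spread = statistics.stdev(values)
    print(f"{name}: mean = {mean:.3f} mm, std dev = {spread:.4f} mm")
```

Averaging several readings reduces the effect of random variation; the standard deviation gives a quick check on the consistency of the readings.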

Digital Micrometer Parts


Theory:
The Digital Micrometer is a precision instrument used to measure the length and diameter of an
object with high accuracy. The instrument consists of a frame, a fixed anvil, a moving spindle, a
thimble, and a digital readout.
The measurement of length is taken by placing the object between the anvil and the spindle and
rotating the thimble until both measuring faces come into contact with the object. The length can
then be read off the digital readout in millimeters or inches, depending on the units selected.
The measurement of diameter is taken by using the spindle and anvil to measure the diameter of the
object. The object is placed between the spindle and anvil, and the thimble is rotated to bring the
measuring faces into contact with the object. The diameter can then be read off the digital readout
in millimeters or inches, depending on the units selected.
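Since the readout can be switched between millimeters and inches, the conversion between the two unit systems (1 inch = 25.4 mm exactly, by definition) can be sketched as:

```python
# Sketch: converting micrometer readings between millimeters and inches.
# By definition, 1 inch = 25.4 mm exactly.

MM_PER_INCH = 25.4

def mm_to_inch(mm: float) -> float:
    return mm / MM_PER_INCH

def inch_to_mm(inch: float) -> float:
    return inch * MM_PER_INCH

print(mm_to_inch(12.7))               # 0.5
print(round(inch_to_mm(0.25), 2))     # 6.35
```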

Operation of Digital Micrometer

The least count of the Digital Micrometer is the smallest distance that can be measured with the
instrument, and it is equal to the resolution of the digital readout. For example, if the digital
readout has a resolution of 0.001 mm, the least count of the Digital Micrometer is 0.001 mm,
regardless of the 0-25 mm measuring range. Dividing the range by the resolution gives the number
of distinct values the display can show (25 mm / 0.001 mm = 25,000 steps), not the least count.
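The distinction between the least count (the display resolution) and the number of display steps over the range can be sketched as:

```python
# Sketch: for a digital micrometer, the least count IS the display
# resolution; dividing the range by the resolution gives the number
# of distinct display steps, not the least count.
# Example values: 0-25 mm range, 0.001 mm resolution.

RANGE_MM = 25.0
RESOLUTION_MM = 0.001

least_count_mm = RESOLUTION_MM               # smallest measurable increment
steps = round(RANGE_MM / RESOLUTION_MM)      # distinct display values

print(least_count_mm, steps)  # 0.001 25000
```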
To ensure accuracy and precision, it is important to handle the Digital Micrometer carefully and
avoid excessive pressure when taking measurements. The measuring faces should be cleaned
regularly to avoid dust and debris from affecting the accuracy of the measurement. It is also
important to check the calibration of the instrument regularly to ensure that it is functioning
properly.
In summary, the Digital Micrometer is a precision instrument used to measure the length and
diameter of an object with high accuracy. The least count of the instrument is determined by the
resolution of the digital readout, and to ensure accuracy and precision, it is important to handle the
instrument carefully, clean the measuring faces regularly, and check the calibration of the
instrument regularly.

Table:

Sr. No.   Specimen   Height (mm)          Height (inch)
1                    4.5 + 46 × 0.02      1.75 + 16 × 0.01
2                    5.7 + 7 × 0.02       2.25 + 47 × 0.01
3                    15.2 + 28 × 0.02     6 + 5 × 0.01
4                    15.05 + 19 × 0.02    5.9 + 20 × 0.01
5                    3.7 + 49 × 0.02      1.45 + 7 × 0.01

Comments:

1. Make sure the Micrometer is calibrated and zeroed before taking measurements. Follow the
manufacturer's instructions to ensure proper calibration.
2. Use gentle pressure when closing the measuring faces onto the specimen to avoid distorting
or damaging the object.
3. Ensure the specimen is held securely and straight between the measuring faces to avoid any
slippage or tilt that may affect the measurement.
4. Use a steady hand when reading the measurement to avoid parallax error. Make sure you
are viewing the measurement straight-on, without any angle or tilt.
5. Take multiple measurements of each specimen to ensure consistency and accuracy. You can
take several measurements and then average the results to reduce any variability.
6. Clean the measuring faces of the Micrometer regularly to avoid dust and debris affecting the
accuracy of the measurements.
7. Avoid touching the measuring faces with bare hands, as skin oils can also affect the
accuracy of the measurements.
