
sensors

Article
Low Computational-Cost Footprint Deformities
Diagnosis Sensor through Angles, Dimensions
Analysis and Image Processing Techniques
J. Rodolfo Maestre-Rendon 1,2, Tomas A. Rivera-Roman 1, Juan M. Sierra-Hernandez 3,
Ivan Cruz-Aceves 4, Luis M. Contreras-Medina 5, Carlos Duarte-Galvan 6 and
Arturo A. Fernandez-Jaramillo 1,*
1 Unidad Académica de Ingeniería Biomédica, Universidad Politécnica de Sinaloa, Carretera Municipal Libre
Mazatlán Higueras km 3, Col. Genaro Estrada, Mazatlán Sin. 82199, Mexico;
[email protected] (J.R.M.-R.); [email protected] (T.A.R.-R.)
2 Center for Biomedical Technology, Polytechnic University of Madrid, Campus Montegancedo,
Pozuelo de Alarcón, Madrid 28223, Spain
3 Departamento de Ingeniería Electrónica, División de Ingenierías, Universidad de Guanajuato,
Carretera Salamanca-Valle de Santiago km 3.5 + 1.8, Comunidad de Palo Blanco,
Salamanca Gto. C.P. 36885, Mexico; [email protected]
4 CONACYT, Centro de Investigación en Matemáticas (CIMAT), A.C., Jalisco S/N, Col. Valenciana,
Guanajuato Gto. 36000, Mexico; [email protected]
5 CA Ingeniería de Biosistemas, División de Investigación y Posgrado, Facultad de Ingeniería,
Universidad Autónoma de Querétaro, Cerro de las campanas S/N,
Santiago de Querétaro Qro. 76010, Mexico; [email protected]
6 Facultad de Ciencias Físico-Matemáticas, Universidad Autónoma de Sinaloa, Av. De las Américas y Blvd.
Universitarios, Cd. Universitaria, Culiacán Sin. 80000, Mexico; [email protected]
* Correspondence: [email protected]; Tel.: +52-669-180-0695

Received: 19 October 2017; Accepted: 14 November 2017; Published: 22 November 2017

Abstract: Manual measurements of foot anthropometry can lead to errors since this task involves the
experience of the specialist who performs them, resulting in different subjective measures from the
same footprint. Moreover, some of the diagnoses that are given to classify a footprint deformity are
based on a qualitative interpretation by the physician; there is no quantitative interpretation of the
footprint. The importance of providing a correct and accurate diagnosis lies in the need to ensure that
an appropriate treatment is provided for the improvement of the patient without risking his or her
health. Therefore, this article presents a smart sensor that integrates the capture of the footprint, a low
computational-cost analysis of the image and the interpretation of the results through a quantitative
evaluation. The smart sensor implemented required the use of a camera (Logitech C920) connected
to a Raspberry Pi 3, where a graphical interface was made for the capture and processing of the
image, and it was adapted to a podoscope conventionally used by specialists such as orthopedists,
physiotherapists and podiatrists. The footprint diagnosis smart sensor (FPDSS) has proven to be
robust to different types of deformity, precise, sensitive and correlated at 0.99 with the measurements
from the digitalized image of the ink mat.

Keywords: embedded system; footprint measurements; Staheli arch index; Clarke’s angle;
Smirak-Chippaux index; biomedical image processing

1. Introduction
It is well known that accurate and quantitative measurements are critical when a clinical diagnosis
is given. Furthermore, manual measurements of the anthropometry of the foot can lead to wrong

Sensors 2017, 17, 2700; doi:10.3390/s17112700 www.mdpi.com/journal/sensors



estimations because of the subjectivity involved in this task or the training and experience of the
person responsible for the measurements. The diagnosis of foot deformities in individuals requires
multiple measurements from the footprint to calculate a set of parameters that determine characteristics
presented in the arch [1]. Studies have shown an inconsistent performance of measurements of the
foot with traditional instruments, like a caliper or an ink mat [2–4]. When the orthopedist gives an
incorrect diagnosis, the appropriate treatment will not be given in time and, consequently, it would
affect the health of the patient. In order to be able to track the improvement of the patient over time
and check the effectiveness of the prescribed treatment, it is necessary that measures be quantitative,
consistent and precise every time. One of the most common deformities in the foot is the flat foot or
pes planus. Flat foot is a condition where the medial longitudinal arch of the foot collapses causing
partial or complete contact with the ground that will cause severe problems in the patient [5].
Conventional methods used in the clinical environment involve the use of a caliper by the
orthopedist to estimate the required measurements of the foot. Manually taking all the measures is
rudimentary and exhausting, and the variability produced by the lack of precision leads to errors in
the calculated parameters. There are several methodologies proposed for the
computer-aided diagnosis of foot deformities. For example, Navarro et al. [6] designed a special
sole template with pressure sensors to evaluate the distribution of the plantar pressure in different
key parts. It is a good approach; however, it is necessary to manufacture different sizes of sole
templates to adapt the position of the sensors to specific foot sizes in order to obtain precise measures
in the appropriate areas. Hamza et al. in 2015 [7] used an ultrasonic sensor that passes under the
foot and acquires the height of the medial longitudinal arch. Nonetheless, the use of ultrasonic
sensors to measure the footprint in a straight line will ignore the measurements of different areas,
not present in the arch, which could be substantial when a diagnosis is given. On the other hand,
there are several computational techniques for the reconstruction of the footprint; for example,
Guerrero-Turrubiates et al. [8] proposed the use of a parabola detection algorithm in the areas of
the arch and the heel of the foot. The finite element method is also widely used to reconstruct the
footprint, or even the complete foot and the pressure it causes [9–11]. Both approaches provide
excellent results but at a high cost in terms of computing resources and time. It is very important to
consider the required resources when implementing the algorithm, especially when it is necessary to
generate an application for embedded devices or dispense with a high-performance and expensive
computer. A recent study [12] presented a system for the analysis of the foot arch by using an RGB-D
camera for 3D image reconstruction of the footprint; such a system has shown good reliability in the
performed experiments. However, the measurements of the footprint and the relation of the forefoot
and the hindfoot with the midfoot are left behind since they only focus on the analysis of the foot arch.
Consequently, the complete analysis of the footprint cannot be given and this could potentially generate
an insufficient diagnosis. Currently, there are no RGB camera-based sensors that automatically analyze
the measurements of the footprint. Table 1 shows a comparison between the methods described above
and our current approach (FPDSS).

Table 1. Comparison of the methods used by different sensors to diagnose foot deformities.

Authors/Method                    Objective of the Method                                              Type of Sensor              Accuracy

Navarro et al. [6]                Detection of flat foot using estimation techniques                   Pressure sensors            N/A 1
Hamza et al. [7]                  Flatfoot detector using the height of the arch                       Ultrasonic sensors          100% in 20 subjects
Guerrero-Turrubiates et al. [8]   Detection of parabola dimensions in the footprint                    Footprint scanner           N/A 1
Chun et al. [12]                  3D image reconstruction of the footprint                             RGB-D camera                ~97.32% in 11 subjects 2
FPDSS                             Detection of deformities in the footprint through image processing   RGB camera with reference   99.38% in 40 subjects 2

1 Accuracy not specified in article. 2 Mean of the accuracy in the three parameters calculated by the sensor.
There are some devices in the market that provide digital images of the footprint but only at
high pressure areas. Additionally, it is necessary for the user to enter quantitative measurements of
the foot. These problems do not allow the automatic calculation of parameters required for the
diagnosis of deformities in the footprint. In Figure 1, the conventional devices used in the diagnosis
by the orthopedist can be observed. The data obtained from the digital caliper and the ink mat will
always result in measurements that are sensitive to the training of the orthopedist because the area
where the measurement must be made may vary from one patient to another [2]. The Computerized
Podoscope (Plantoscopio Computarizado, Puebla, Mexico) is an optical device with which it is
possible to scan and visualize the footprint, without any analysis that determines the condition
present in the foot. In the same way, ArcoScan (CiPar Ingeniería, Paraná, Argentina) is a podoscope
that allows you to digitalize and store the footprint but, unlike the previous one, it also allows the
estimation of areas with higher pressure by using the intensity levels of the pixels in the region of
the foot. However, these devices do not include a qualitative or quantitative analysis of the foot,
like measurements and angles of the footprint, and they will always require interpretation by the
orthopedist in order to give a diagnosis. Having measurements on a real scale and the automatic
calculation of parameters that indicate deformities in the foot is very important to provide a more
efficient diagnosis. It is important to mention that foot deformities should be treated as quickly and
efficiently as possible to reduce the effects they may cause in other areas of the body [13–15]. When the
deformity is correctly diagnosed, an appropriate treatment is given to the patient.

(a) (b) (c)

Figure 1. Traditional devices used for the measurement of the foot: (a) Digital caliper; (b) Ink mat;
(c) Digital podoscope (modified from [3]).

However, this process can be automated using a digital footprint analysis smart sensor, which
would eliminate the disadvantages of manual measurement while computing the parameters needed
for a correct diagnosis. The objective of this article is to introduce a sensor that integrates the capture
of the plantar footprint, the analysis of the image and the interpretation of the results through a
quantitative evaluation. The advantage over conventional methods lies in the ability to automatically
detect footprint measurements and the angle of the longitudinal arch of the foot on a real scale,
quickly and with minimal computational resources, and, in the same way, to calculate three essential
parameters that fully describe the footprint quantitatively: the Staheli index, Clarke's angle and the
Chippaux-Smirak index. Consequently, it provides an accurate interpretation of the results for both
the diagnosis and the orthopedist. Finally, the calculated parameters from the patient's footprint
are saved and plotted to graphically observe the improvement over time and to ensure that the
treatment given by the orthopedist is right for the condition, so that the patient's health is not put at
risk and the recovery time is reduced. In summary, a smart sensor is presented for the automatic
quantitative interpretation and analysis of footprint images, resulting in a computer-aided diagnosis
of deformities in the foot.

2. Materials and Methods

2.1. Smart Sensor Setup
The implementation of the smart sensor presented required the use of a modified regular
podoscope, as can be observed in Figure 2; a one square centimeter black sticker as a reference area
was placed on the top of a tempered glass with a frosted film [13] and a generic camera was embedded
under the glass; in this case, a Logitech HD Pro Webcam Model C920 (Logitech, Lausanne,
Switzerland) was used. This allowed the capture of the footprint in an image file for the computer.
The image capture parameters of the camera were set to 40% brightness, 50% contrast, 60% sharpness
and 60% white balance. As shown in Figure 3, the camera was connected to an embedded computer
(Raspberry Pi 3, Raspberry Pi Foundation, Cambridge, UK) with the Raspbian Jessie operating system
for the analysis of the obtained images through the interaction of the processing algorithm and the
input of the user in the graphical interface.


Figure 2. Adaptation of the original podoscope placing a frosted film on the top and the camera
connected to the Raspberry Pi 3 via USB. Using a USB camera, the Raspberry Pi 3 acquires an image of
the patient's footprints (left) and processes it to display the results in the graphical user interface (right).

The smart sensor and the involved process can be applied in any generic podoscope by integrating
a camera to take the images of the footprint and using the software for analysis. In this way,
conventional equipment can be improved to provide more accurate diagnostics. Figure 3 shows the
process that the smart sensor follows to provide the results to the user.

Figure 3. Block diagram of the smart sensor.

2.2. Footprint Parameters

Three parameters were used for the analysis and evaluation of the FPDSS images and obtained
data. Therefore, this section explains the importance of the metrics used and how they are obtained in
footprint images.
The Staheli arch index is used to describe the relation between the minimum width of the midfoot
and the maximum width of the hindfoot. This index has been one of the most studied parameters for
the description of footprint deformities and historically represents an important index for the detection
of flat foot [16]. In addition to the above, it is considered an essential diagnostic method compared
to the talar–first metatarsal angle, commonly used by specialists in the field for the detection of
deformities in the footprint [17].

Clarke's angle or arch angle describes the measure of the internal longitudinal arch. According
to [18], this parameter may have been the first proposed for studies in deformities and posture, as it is
proportional to the arch height. This index comes from the angle between a line that goes from the
outer point of the forefoot to the outer point of the hindfoot and another line connecting the outer
point of the forefoot and the inner point of the midfoot.

Finally, the Chippaux-Smirak index is the relation between the minimum width of the midfoot
and the maximum width of the forefoot. When the arch of the foot is larger, this index decreases in
value [18]. By using different parameters that entirely describe the footprint, the values obtained
during the analysis allow a solid diagnosis to be performed by the FPDSS. Figure 4 describes the lines
mentioned above for the calculation of the different parameters.

Figure 4. Footprint parameters used by the FPDSS to evaluate the deformities in the footprint:
(a) Staheli index; (b) Clarke's angle; (c) Chippaux-Smirak index.

2.3. Algorithms for Estimation of Measures in Footprint


The proposed algorithm is based on the application of image processing techniques.
The programming language used for the development of this algorithm was C. The design of a
graphical user interface (GUI) was performed with the use of GTK+ for a friendly user interaction with
the results and the algorithm. These methods were implemented using the OpenCV library [19] for
the manipulation of the footprint images.
In the first part, the camera acquires an 8-bit image in JPEG format with dimensions
of 1920 × 1080 pixels and RGB color space. Once the image is introduced in the software,
several preprocessing steps are made. Firstly, the RGB color image was converted into CIELAB
(CIE 1976 L*a*b*) color space. CIELAB provides a better interpretation of the image for the algorithm
since the luminosity is separated in a single channel and totally ignored in the color channels (a* and b*).
The output channels of the CIELAB model gave a better correlation with the visual interpretation
of the footprint images, something that is of great importance when the area of the footprint needs
to be separated from the skin of the foot. After that, a range of values in the a* and b* channels was
tested for binarizing the image based on an image thresholding step. The threshold for the a* channel
was set to 122 and for the b* channel to 131, since that combination of thresholds corresponds to
any intensity of the yellow and red colors in the CIELAB color space. That means that any intensity
above the thresholds will be set as white pixels (1) and the rest will be black pixels (0), the range of
intensity per pixel goes from 0 to 255 in 8-bit images. Subsequently, the region of the footprint is
correctly segmented in a binary image. Then, the binarized image of the previous step is dilated to
expand the size of the footprint and to cover all the small parts that remained outside the foot during
the thresholding process. Finally, the image is segmented only to get the area that we are interested
in; that is to say, the footprint excluding the toes. This was performed using a function that detects
the contour of the biggest object in the image and traces a boundary so that everything outside the
contour is set as background or black pixels.
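The thresholding and dilation steps above can be sketched in plain C. This is a minimal illustration, not the paper's implementation (which uses OpenCV): the 8 × 8 buffers, the function names and the 3 × 3 structuring element are assumptions for the example; only the a* > 122 and b* > 131 thresholds come from the text.

```c
#define W 8
#define H 8

/* Keep a pixel (1) only when both chrominance channels exceed the
 * thresholds given in the text: a* > 122 and b* > 131. */
void threshold_ab(const unsigned char *a, const unsigned char *b,
                  unsigned char *mask)
{
    for (int i = 0; i < W * H; i++)
        mask[i] = (a[i] > 122 && b[i] > 131) ? 1 : 0;
}

/* 3x3 binary dilation: grows the footprint region so that small parts
 * lost during thresholding are covered again. */
void dilate3x3(const unsigned char *in, unsigned char *out)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            unsigned char v = 0;
            for (int dy = -1; dy <= 1; dy++)
                for (int dx = -1; dx <= 1; dx++) {
                    int ny = y + dy, nx = x + dx;
                    if (ny >= 0 && ny < H && nx >= 0 && nx < W &&
                        in[ny * W + nx])
                        v = 1;
                }
            out[y * W + x] = v;
        }
}
```

In the real pipeline the equivalent OpenCV calls operate on the full 1920 × 1080 a* and b* planes, and the contour-based segmentation then keeps only the largest connected object.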
After the preprocessing stage of the footprint is done, the algorithm looks for the one square
centimeter reference in the R (red) channel of the image, since it is the channel of the RGB color model
where it is easier to obtain the reference. This is accomplished with a function that iterates over the
image and looks for continuous black pixels in the bottom and counts them to know the area of the
square [20]. The number of pixels in one square centimeter is saved for using it when a conversion
from pixels to centimeters is requested.
Afterwards, the algorithm finds the key points of three parameters in the processed image used
for the analysis and detection of deformities in the footprint [1,21]. The parameters automatically
determined by this algorithm are the Staheli arch index, Clarke’s angle and the Chippaux-Smirak index.
For the calculation of the parameters, the foot is divided into three main areas: hindfoot, midfoot and
forefoot as shown in Figure 5. Before finding the parameters in the footprint, a function is implemented to
find if there is a right foot, left foot or both by iterating over the binarized image to find the number of
contours and their corresponding position. By knowing which foot is on the image, the right functions are
applied and, therefore, it maximizes the precision of the calculation and diagnosis. Finally, the algorithm
first obtains the measurement of the line in pixels, by using the location of the pixels at the starting
point and the ending point with Equation (1), and then transforms it to centimeters using the previously
calculated reference:

Z = √((x2 − x1)² + (y2 − y1)²)  (1)

Figure 5. Divisions of the footprint in 3 zones.



The Staheli arch index was estimated by the division of the narrowest line in the midfoot (Q)
over the widest line in the hindfoot (R) [16]. The narrowest line in the midfoot is calculated by a
function that iterates row by row from right to left (for the left foot) or left to right (for the right foot),
and finds the pixel that is furthest from the starting point (inner point or q2). Then, it draws a
horizontal line that goes from that point to the end of the segmented footprint (q1) at the left (for the
left foot) or right (for the right foot). For the widest line in the hindfoot, the function is the same as
above, but this time it finds the pixel that is the closest one to the starting position of the algorithm
(outer point or r2) and the point (r1) at the end of the horizontal line through the hindfoot. Figure 6
illustrates the process described above.

Figure 6. The process of the Staheli arch index calculation.
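The row-by-row search can be sketched as follows. This is a simplified version that measures each row's width as the span between its first and last foreground pixels, then takes the minimum over a midfoot row range (Q) and the maximum over a hindfoot row range (R); the mask size, the zone boundaries passed as arguments and the function names are illustrative assumptions, not the paper's exact implementation.

```c
#define FW 10  /* mask width in pixels  */
#define FH 12  /* mask height in pixels */

/* Width of the footprint on one row: span between the first and
 * last foreground pixels (0 when the row is empty). */
int row_width(const unsigned char *mask, int row)
{
    int first = -1, last = -1;
    for (int x = 0; x < FW; x++)
        if (mask[row * FW + x]) {
            if (first < 0)
                first = x;
            last = x;
        }
    return (first < 0) ? 0 : last - first + 1;
}

/* Staheli arch index: narrowest midfoot width Q divided by the
 * widest hindfoot width R, over the given row ranges. */
double staheli_index(const unsigned char *mask,
                     int mid_top, int mid_bottom,
                     int hind_top, int hind_bottom)
{
    int q = FW + 1, r = 0;
    for (int y = mid_top; y <= mid_bottom; y++) {
        int w = row_width(mask, y);
        if (w > 0 && w < q)
            q = w;
    }
    for (int y = hind_top; y <= hind_bottom; y++) {
        int w = row_width(mask, y);
        if (w > r)
            r = w;
    }
    return (r > 0) ? (double)q / (double)r : 0.0;
}
```

A midfoot 2 pixels wide over a hindfoot 4 pixels wide yields an index of 0.5; in the sensor the widths are first converted to centimeters with the reference scale before the ratio is reported.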

The Clarke’s angle is calculated by taking the angle between a line T, that goes from the outer
point of Clarke’s
The the forefoot angle is calculated
to the outer pointbyoftaking the angle
the hindfoot, and between
a line S,athat
linegoes
T, that
fromgoes
thefrom
outerthe outer
point of
Sensors 2017, 17, 2700

point of the forefoot to the outer point of the hindfoot, and a line S, that goes from the outer point of the forefoot to the inner point of the midfoot [22]. The line T is calculated by iterating, in the forefoot and hindfoot, pixel by pixel from right to left (for the left foot) or left to right (for the right foot), to find the pixel (t1) that is closest to the starting point in both areas. The line S is calculated by drawing a line from the outer point in the forefoot (t1), found in the previous step, to the inner point of the midfoot (s1). After drawing both lines and estimating their measurements, the angle estimation requires the measurement of a third line, F, that goes from the outer point of the hindfoot (f1) to the inner point of the midfoot (s1). Figure 7 shows the lines specified above. Equation (2) comes from the calculation of the angle of an isosceles triangle and is used for Clarke's angle estimation in this analysis:

Clarke's Angle = cos⁻¹((S² − T² − F²) / (−2·T·F))    (2)

Figure 7. The process of Clarke's angle estimation.
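Equation (2) can be evaluated directly once the three line lengths are known. A minimal Python sketch (illustrative only, not the authors' implementation; the function name is hypothetical):

```python
import math

def clarke_angle(S: float, T: float, F: float) -> float:
    """Clarke's angle in degrees from the three line lengths of
    Equation (2); S is the side opposite the angle formed by T and F."""
    cos_value = (S**2 - T**2 - F**2) / (-2.0 * T * F)
    # Clamp against floating-point rounding before taking the arccosine.
    cos_value = max(-1.0, min(1.0, cos_value))
    return math.degrees(math.acos(cos_value))
```

The clamp guards against arguments that drift just outside [−1, 1] when the lengths come from noisy pixel measurements.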

The Chippaux-Smirak index requires the measurement of the widest line (P) of the forefoot, because we already have the narrowest line (Q) in the midfoot from the Staheli arch index calculation. The algorithm iterates over the forefoot to find the widest line using outer (p1) and inner (p2) points from the starting position of the algorithm (middle of the image), as shown in Figure 8. After having calculated the measurements of both lines, the index is estimated by Equation (3) [18]:

Chippaux-Smirak index (%) = (Q / P) × 100    (3)

Figure 8. Process of the Chippaux-Smirak index calculation.
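The row-by-row search for the widest forefoot line P and the narrowest midfoot line Q, followed by Equation (3), can be sketched as below (a simplified illustration on binary masks; the function names and the NumPy representation are assumptions, not the paper's code):

```python
import numpy as np

def span_width(region: np.ndarray, widest: bool = True) -> int:
    """Iterate over a binary footprint region row by row and return the
    widest (or narrowest non-empty) span between the outer and inner
    foreground pixels, mirroring the search for P and Q."""
    best = None
    for row in region:
        cols = np.flatnonzero(row)
        if cols.size == 0:
            continue  # skip rows with no footprint pixels
        width = int(cols[-1] - cols[0] + 1)
        if best is None or (width > best if widest else width < best):
            best = width
    return best or 0

def chippaux_smirak(forefoot: np.ndarray, midfoot: np.ndarray) -> float:
    """Equation (3): narrowest midfoot width Q over widest forefoot
    width P, in percent."""
    p = span_width(forefoot, widest=True)
    q = span_width(midfoot, widest=False)
    return q / p * 100.0
```

Measuring the span between the outermost foreground pixels (rather than counting pixels) matches the outer/inner point pairs (p1, p2) described above.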

Finally, the algorithm saves the acquired parameter information for further analysis and gives the user the possibility to add new lines in the image, with an automatic calculation of the distance in centimeters. In Figure 9, the results of each part of the algorithm are shown visually.

Figure 9. Steps of the algorithm for the analysis of the footprint: (a) RGB image obtained from the podoscope; (b) Thresholding at 183 of the red channel; (c) Segmentation of the footprint and removal of toes; (d) Location of the reference (square centimeter) in the image; (e) Staheli arch index; (f) Clarke's angle; (g) Chippaux-Smirak index.
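Step (b) of Figure 9, thresholding the red channel at 183, can be sketched as follows (assuming the image is an H×W×3 array in RGB channel order; the original implementation reportedly builds on OpenCV [19], but a NumPy comparison expresses the same operation):

```python
import numpy as np

def segment_footprint(rgb: np.ndarray, level: int = 183) -> np.ndarray:
    """Binarize the red channel at the fixed level shown in Figure 9b.
    Toe removal and contour cleanup (Figure 9c) are separate steps."""
    red = rgb[:, :, 0]  # red channel, assuming RGB order
    return (red > level).astype(np.uint8)
```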

2.4. Graphic User Interface

The graphic user interface (GUI) was developed to provide a friendly tool for the user to quantitatively analyze and measure the footprint images by implementing the proposed algorithm. The first step consists in using the camera to acquire the footprint image and, once it is selected, the algorithm processes the image and gives the output by automatically drawing the aforementioned lines, with their respective measurements in centimeters, and calculating the three parameters.

As shown in Figure 10, the GUI has some extra functions that give the user the possibility to add new lines, or to modify and remove lines that are already drawn in the image. This is possible because the image is placed at the bottom of a drawing area where the user can take measurements manually using a mouse. The software records the coordinates of the starting point and the ending point of the new line so that the conversion from pixels to centimeters can be performed. It also recognizes when a click falls on a specific line, so that the line can be either modified or erased. There are four tabs at the top of the image; each one includes different lines drawn by the software. For example, the first tab includes all the lines that the algorithm used for the detection of the three parameters. The remaining tabs include just the lines used for the estimation of the index indicated in their titles.

Figure 10. The graphical user interface of the footprint analysis sensor.
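The pixel-to-centimeter conversion for a user-drawn line follows from the scale given by the square-centimeter reference detected in the image (Figure 9d). A minimal sketch, with a hypothetical helper name:

```python
import math

def line_length_cm(start, end, pixels_per_cm: float) -> float:
    """Length in centimeters of a line drawn between two pixel
    coordinates, scaled by the square-centimeter reference."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    return math.hypot(dx, dy) / pixels_per_cm
```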
The parameters that result from the analysis of the footprint can be plotted and saved over time to graphically observe the improvement of the patient, as depicted in Figure 11, and the effectiveness of the prescribed treatment. There are three plots, one for each index, and the index can be observed for each foot independently. Finally, the user has the possibility to export as PDF or print the image with the drawn lines.
Figure 11. Example plots of a patient's improvement over time with the evaluation of different parameters: (a) Staheli Index; (b) Chippaux-Smirak Index; (c) Clarke's Angle.
2.5. Experimental Setup

Precision and accuracy experiments were designed to evaluate the robustness and reliability of the proposed algorithm. The experiments were performed on 15 women and 25 men between the ages of 18 and 24 who presented different types of foot: cavus foot, normal foot and flat foot. Automatic measurements were implemented using our designed software, while manual measurements required a quantitative interpretation of the ink mat performed by hand.

The precision experiment consisted in asking the subjects to step on the podoscope, in a bipedal posture, so that we could acquire the contact region of the feet with the glass using the camera. Next, the subject stepped off the podoscope. This process was repeated five times so that we could obtain enough images per subject to evaluate the precision of the algorithm. We collected 200 images of footprints from the subjects, with the respective calculation of parameters for each foot.

In addition, the accuracy experiment requires the use of a reference measurement against which to compare the parameters calculated from the footprint images. Therefore, the ink mat was used to perform manual measurements as a reference for our method. After the footprint image acquisition process, we asked each subject to place each foot individually on the ink mat. Then, we obtained the footprint on a sheet of paper where we could manually calculate the parameters obtained by our software. In order to get the most comparable data and to address the human error from manual measurements, we also processed the footprints from the ink mat using our algorithm.
3. Results

Once the analysis was performed on the collected images, we were able to compare the calculated parameters obtained using our proposed method. For each subject, a precision analysis was performed using the one-sigma and three-sigma rules, as described in [23]; the three-sigma rule tells us that 99.7% of the data must fall within three standard deviations of the mean, which tests the distribution normality of the dataset [24].
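The three-sigma check applied to each subject's repeated measurements can be sketched as follows (illustrative only; `passes_three_sigma` is a hypothetical name, and the sample standard deviation is an assumption):

```python
import statistics

def passes_three_sigma(values):
    """Return True when every repeated measurement of an index lies
    within three standard deviations of the sample mean, the normality
    check applied to each subject's five footprint images."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return all(abs(v - mean) <= 3 * sigma for v in values)
```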

Figure 12. Precision analysis results where the vertical axis represents the value of the index and the horizontal axis is the right and left foot of the subjects arranged two by two: (a) Staheli index; (b) Chippaux-Smirak Index; (c) Clarke's Angle.
The mentioned rule was applied to every index (Staheli, Chippaux-Smirak and Clarke index) of a dataset of five images per subject's foot (40 subjects), as we can observe in Figure 12, where the obtained data from both feet have been sorted in ascending order according to the value of the right foot of the subject, placing the value of the left foot next to it. All data satisfied the statistical analysis, showing the robustness of the implemented algorithm. Specifically, the one-sigma rule tells us that approximately 68% of the data must lie within one standard deviation of the mean of a set of data. This was performed on the data and the results are presented in Table 2, which shows the number of cases with a specific amount of values that fall outside of the one-sigma analysis per subject's foot. Since there were no cases with more than three values outside of the one-sigma evaluation, we chose to include only the 2, 1 and 0 cases in the table.

Table 2. Number of cases with the respective amount of values (2, 1 or 0) out of the one-sigma (1σ) rule in every calculated index from the dataset of a subject's foot.

                           Amount of Values out of 1σ Rule
Index Name                       2        1        0
Staheli Index                   14       57        9
Chippaux-Smirak Index           18       46       16
Clarke Index                    12       49       19
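The per-foot tally behind Table 2 can be reproduced with a small helper (a sketch, under the assumption that each foot contributes five repeated index values and that the sample standard deviation is used):

```python
import statistics

def values_outside_one_sigma(values):
    """Count how many of a foot's repeated index values fall outside
    one standard deviation of their mean, the quantity tallied in Table 2."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return sum(1 for v in values if abs(v - mean) > sigma)
```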

In order to perform a complete analysis of the accuracy of our method, three large sets of data per index were compared: the data acquired by the proposed method, the data from the manual measurements in the footprints obtained from the ink mat on a sheet of paper, and the data obtained by applying our algorithm to the digitalized footprint of the ink mat. To easily visualize and compare the data, the acquired footprints were classified into five groups, as observed in Table 3, based on the limits established in [25–27]: cavus foot, cavus-normal foot, normal foot, normal-flat foot and flat foot.

Table 3. The ranges for the classification into five groups of footprints, where: 1—Cavus foot; 2—Cavus-Normal foot; 3—Normal foot; 4—Normal-Flat foot; 5—Flat foot.

                                 Classification Groups
Index Name                   1          2          3          4         5
Staheli Index             <0.3    0.3–0.4    0.4–0.7    0.7–0.8      >0.8
Chippaux-Smirak Index    <0.22   0.22–0.3    0.3–0.5    0.5–0.7      >0.7
Clarke Index               >50      38–50      26–38      15–26       <15
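The grouping of Table 3 can be expressed with simple bin edges. In this sketch, the tie-breaking at exact boundary values is an assumption, since the paper only lists the ranges:

```python
import bisect

# Group boundaries taken from Table 3.
STAHELI_EDGES = [0.3, 0.4, 0.7, 0.8]
CHIPPAUX_EDGES = [0.22, 0.3, 0.5, 0.7]
CLARKE_EDGES = [15, 26, 38, 50]  # ascending; the Clarke scale is reversed

def classify_ascending(value, edges):
    """Group 1-5 for indices that grow from cavus to flat
    (Staheli, Chippaux-Smirak)."""
    return bisect.bisect_left(edges, value) + 1

def classify_clarke(angle):
    """Clarke's angle decreases from cavus (>50) to flat (<15),
    so the group order is reversed."""
    return 5 - bisect.bisect_left(CLARKE_EDGES, angle)
```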

The comparison of the three datasets for each index is shown in Figure 13.
Furthermore, the relationship between the three methods was verified using the Pearson correlation
coefficient, so we could obtain and compare the accuracy and reliability of the proposed method
among the others [28]. Table 4 indicates the results of the correlation analysis.

Figure 13. Accuracy analysis performed on three different datasets (proposed method, manual ink mat and digitalized ink mat). The vertical axis represents the five groups of classified footprints. The horizontal axis gives the number of cases in the respective group: (a) Staheli index; (b) Chippaux-Smirak Index; (c) Clarke's Angle.

Table 4. Comparison of values obtained from the evaluated methods and their respective correlation coefficient.

Compared Methods                                  Correlation Coefficient 1
Ink mat (manual) vs. Ink mat (digitalized)                 0.9618
FPDSS 2 vs. Ink mat (manual)                               0.9619
FPDSS 2 vs. Ink mat (digitalized)                          0.9938

1 Calculations based on the method of Pearson correlation coefficient [28]. 2 FPDSS = Footprint Diagnosis Smart Sensor.
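The Pearson correlation coefficient used for Table 4 can be computed directly from two paired series of index values (a textbook implementation, not the authors' code):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series
    of footprint index values (e.g. FPDSS vs. digitalized ink mat)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)
```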

4. Discussion and Conclusions

Through the analysis of the data obtained in the performed experiments, we can infer the subjectivity of manual measurements of the footprint and of the respective calculation of the three major parameters: the Staheli, Chippaux-Smirak and Clarke indices. Thus, our proposed smart sensor for measuring and diagnosing deformities in the footprint through image processing eliminates the subjectivity of manual diagnosis. Additionally, it performs a quantitative analysis and presents a standard measurement for the correct diagnosis and treatment of the patient. For that reason, the quantitative interpretation gives a more accurate perspective of the medical condition of a patient, because the smart sensor records the history of parameters over consultations to evaluate the effectiveness of the treatment. These advantages in diagnosis are not present in any of the current methods for footprint analysis. Although the experiments were conducted on people within the age range of 17–24 years, they were carefully selected to encompass the different footprint classification groups.
The precision analysis shows the reliability and consistency of the proposed smart sensor, since all measurements for every subject stay within the specified range of three standard deviations from the mean. This confirms the distribution normality of the data, proving that the calculated measurements for any subject are obtained with consistency and without significant variation. In addition, the accuracy analysis shows the difficulties of manually calculating the parameters from footprints obtained using the ink mat, where results depend on the experience and perception of the technician performing them. At the same time, the ink mat can lead to errors, since the ink placed on the paper can take up a larger area than expected, leading to significant differences when compared to other methods, as stated in [29]. The implemented method shows a correlation of 0.99 when compared to a digitalized version of the footprint from an ink mat. Consequently, we were able to correctly detect and keep track of abnormalities of the footprint through the Staheli, Chippaux-Smirak and Clarke indices in 40 subjects using our proposed method of image processing and a modified podoscope.
First, a comparison between the results of the implemented method and the manual measurements was made, and we found an interesting disparity in the data. After that, the footprint sheets from the ink mat were scanned and the implemented algorithm was applied, resulting in an improved correlation between our results and the scanned footprints, since the average classification error went from 21% to 8%. As verified by several studies [2], the standardization of measurement techniques is required to reduce inter-observer and random errors, something that is hard to achieve when different people manually calculate specific parameters of the footprint. Recently, Su et al. in 2016 [30] described the estimation of plantar pressure and arch index through the intensity of the pixels from a digitalized version of the footprints from an ink mat, but still using a qualitative interpretation of the obtained data. At the same time, others [12] have proposed a solution for footprint deformities using an RGB-D camera for reconstruction, performing a quantitative analysis of the foot arch to obtain several arch parameters.
In this study, a footprint diagnosis smart sensor was proven to be robust to different types of deformity, precise, sensitive and adaptable to most podoscopes used by specialists. Consequently, the possibility of performing a quantitative analysis through the presented smart sensor allows the operator to carry out statistical control of the patient over consultations, resulting in an evaluation of the effectiveness of the prescribed treatment. Additionally, the low computational cost of the implemented algorithm allows it to be used in embedded devices like a Raspberry Pi. Above all, this smart sensor will ultimately result in an accurate quantitative diagnosis that will significantly improve the patient's condition in less time using the proper treatment. Further research is needed to correlate the footprint deformations in bipedal and single-leg stance positions and, thereby, to perform an analysis of the postural effects on the footprint.

Acknowledgments: This work was partially supported by Apoyo a la Incorporación de Nuevos PTC de PRODEP
UPSIN-PTC-030. Also, we want to thank Andrea Catalina de la Concepción Correa Giraldo and Absalón Eduardo
Flores Portillo for supervising the biomechanical measurements at the Licenciatura en Terapia Física clinic at the
Universidad Politécnica de Sinaloa. We would like to thank Karenina Haro for the final details in the English
editing. Finally, an image from the original article [3] in J. Foot Ankle Res. has been adapted for this work.
Author Contributions: Ivan Cruz-Aceves and Luis M. Contreras-Medina were involved in the design of the low-computational-cost algorithm; the adaptation of the podoscope was designed by Tomas A. Rivera-Roman, J. Rodolfo Maestre-Rendon and Carlos Duarte-Galvan; the optical setup and camera adaptation were carried out by Juan M. Sierra-Hernandez; algorithm and hardware integration by Tomas A. Rivera-Roman and Arturo A. Fernandez-Jaramillo; the experimental design and measurements were performed by Tomas A. Rivera-Roman, J. Rodolfo Maestre-Rendon and Arturo A. Fernandez-Jaramillo; finally, the structure and writing of the article by Tomas A. Rivera-Roman, J. Rodolfo Maestre-Rendon and Arturo A. Fernandez-Jaramillo.
Conflicts of Interest: The authors declare no conflict of interest.
Ethical Statements: All subjects gave their informed consent for inclusion before they participated in the study.
The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved.

References
1. Queen, R.M.; Mall, N.A.; Hardaker, W.M.; Nunley, J.A. Describing the medial longitudinal arch using
footprint indices and a clinical grading system. Foot Ankle Int. 2007, 28, 456–462. [PubMed]
2. Kouchi, M.; Mochimaru, M.; Tsuzuki, K.; Yokoi, T. Interobserver errors in anthropometry. J. Hum. Ergol.
1999, 28, 15–24.
3. Lee, Y.-C.; Lin, G.; Wang, M.-J.J. Comparing 3D foot scanning with conventional measurement methods.
J. Foot Ankle Res. 2014, 7, 44. [CrossRef] [PubMed]
4. Mall, N.A.; Hardaker, W.M.; Nunley, J.A.; Queen, R.M. The reliability and reproducibility of foot type
measurements using a mirrored foot photo box and digital photography compared to caliper measurements.
J. Biomech. 2007, 40, 1171–1176. [CrossRef] [PubMed]
5. Van Boerum, D.H.; Sangeorzan, B.J. Biomechanics and pathophysiology of flat foot. Foot Ankle Clin. 2003, 8,
419–430. [CrossRef]
6. Navarro, L.A.; García, D.O.; Villavicencio, E.A.; Torres, M.A.; Nakamura, O.K.; Huamaní, R.; Yabar, L.F.
Opto-electronic system for detection of flat foot by using estimation techniques: Study and approach of
design. In Proceedings of the 2010 Annual International Conference of the IEEE Engineering in Medicine
and Biology Society (EMBC), Buenos Aires, Argentina, 31 August–4 September 2010; pp. 5768–5771.
7. Hamza, A.O.; Ahmed, H.K.; Khider, M.O. A new noninvasive flatfoot detector. J. Clin. Eng. 2015, 40, 57–63.
[CrossRef]
8. Guerrero-Turrubiates, J.D.J.; Cruz-Aceves, I.; Ledesma, S.; Sierra-Hernandez, J.M.; Velasco, J.;
Avina-Cervantes, J.G.; Avila-Garcia, M.S.; Rostro-Gonzalez, H.; Rojas-Laguna, R. Fast parabola detection
using estimation of distribution algorithms. Comput. Math. Methods Med. 2017, 2017, 6494390. [CrossRef]
[PubMed]
9. Bates, K.; Savage, R.; Pataky, T.; Morse, S.; Webster, E.; Falkingham, P.; Ren, L.; Qian, Z.; Collins, D.; Bennett, M.
Does footprint depth correlate with foot motion and pressure? J. R. Soc. Interface 2013, 10, 20130009. [PubMed]
10. Cheung, J.T.-M.; Zhang, M. A 3-dimensional finite element model of the human foot and ankle for insole
design. Arch. Phys. Med. Rehabil. 2005, 86, 353–358. [CrossRef] [PubMed]
11. Morales-Orcajo, E.; Bayod, J.; de Las Casas, E.B. Computational foot modeling: Scope and applications.
Arch. Comput. Methods Eng. 2016, 23, 389–416.
12. Chun, S.; Kong, S.; Mun, K.-R.; Kim, J. A foot-arch parameter measurement system using a RGB-D camera.
Sensors 2017, 17, 1796. [CrossRef] [PubMed]
13. Burns, J.; Keenan, A.-M.; Redmond, A. Foot type and overuse injury in triathletes. J. Am. Podiatr. Med. Assoc.
2005, 95, 235–241. [CrossRef] [PubMed]
14. Chuckpaiwong, B.; Nunley, J.A.; Mall, N.A.; Queen, R.M. The effect of foot type on in-shoe plantar pressure
during walking and running. Gait Posture 2008, 28, 405–411. [CrossRef] [PubMed]
15. Dahle, L.K.; Mueller, M.; Delitto, A.; Diamond, J.E. Visual assessment of foot type and relationship of foot
type to lower extremity injury. J. Orthop. Sports Phys. Ther. 1991, 14, 70–74. [CrossRef] [PubMed]
16. Staheli, L.T.; Chew, D.E.; Corbett, M. The longitudinal arch. J. Bone Joint Surg. Am. 1987, 69, 426–428.
[PubMed]
17. Plumarom, Y.; Imjaijitt, W.; Chaiphrom, N. Comparison between staheli index on harris mat footprint and
talar-first metatarsal angle for the diagnosis of flatfeet. J. Med. Assoc. Thail. 2014, 97, S131–S135.
18. Shiang, T.-Y.; Lee, S.-H.; Lee, S.-J.; Chu, W.C. Evaluating different footprints parameters as a predictor of
arch height. IEEE Eng. Med. Biol. Mag. 1998, 17, 62–66. [CrossRef] [PubMed]
19. Bradski, G. The OpenCV library. Dr. Dobb's J. 2000, 25, 120–126.
20. Vazquez-Cruz, M.; Jimenez-Garcia, S.; Luna-Rubio, R.; Contreras-Medina, L.; Vazquez-Barrios, E.;
Mercado-Silva, E.; Torres-Pacheco, I.; Guevara-Gonzalez, R. Application of neural networks to estimate
carotenoid content during ripening in tomato fruits (Solanum lycopersicum). Sci. Hortic. 2013, 162, 165–171.
[CrossRef]
Sensors 2017, 17, 2700 17 of 17

21. Aruntammanak, W.; Aunhathaweesup, Y.; Wongseree, W.; Leelasantitham, A.; Kiattisin, S. Diagnose flat
foot from foot print image based on neural network. In Proceedings of the 2013 6th Biomedical Engineering
International Conference (BMEiCON), Amphur Muang, Thailand, 23–25 October 2013; pp. 1–5.
22. Cavanagh, P.R.; Rodgers, M.M. The arch index: A useful measure from footprints. J. Biomech. 1987, 20,
547–551. [CrossRef]
23. Fernandez-Jaramillo, A.A.; de Jesus Romero-Troncoso, R.; Duarte-Galvan, C.; Torres-Pacheco, I.;
Guevara-Gonzalez, R.G.; Contreras-Medina, L.M.; Herrera-Ruiz, G.; Millan-Almaraz, J.R. FPGA-based
chlorophyll fluorescence measurement system with arbitrary light stimulation waveform using direct digital
synthesis. Measurement 2015, 75, 12–22. [CrossRef]
24. Ghasemi, A.; Zahediasl, S. Normality tests for statistical analysis: A guide for non-statisticians. Int. J.
Endocrinol. Metab. 2012, 10, 486–489. [CrossRef] [PubMed]
25. De La Fuente, J.L.M. General Podology and Biomechanics; Elsevier: Barcelona, Spain, 2009.
26. Drapé, J.-L. Diagnostic Imaging of Foot Conditions; Elsevier: Barcelona, Spain, 2000.
27. Núñez-Samper, M.; Alcázar, L.F.L. Biomechanics, Medicine and Surgery of the Foot; Elsevier: Barcelona, Spain, 2007.
28. Lawrence, I.; Lin, K. A concordance correlation coefficient to evaluate reproducibility. Biometrics 1989, 45,
255–268.
29. Fascione, J.M.; Crews, R.T.; Wrobel, J.S. Dynamic footprint measurement collection technique and intrarater
reliability: Ink mat, paper pedography, and electronic pedography. J. Am. Podiatr. Med. Assoc. 2012, 102,
130–138. [CrossRef] [PubMed]
30. Su, K.-H.; Kaewwichit, T.; Tseng, C.-H.; Chang, C.-C. Automatic footprint detection approach for the
calculation of arch index and plantar pressure in a flat rubber pad. Multimed. Tools Appl. 2016, 75, 9757–9774.
[CrossRef]

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
