
COMP1942 Suggested Answer

COMP1942 Exploring and Visualizing Data (Spring Semester 2020)


Online Final Examination (Suggested Answer)
Date: 24 May, 2020 (Sunday)
Time: 6:30pm-8:30pm
Duration: 2 hours

Part A
Q1 (15 Marks) (Version 1)

Item Freq
a 3
b 5
c 7
d 7
e 1
f 7
g 1
h 1
i 1
j 1
k 1
l 1
m 1
n 1
o 1
p 1
q 1
r 1
s 1
t 1

Freq items:
Item Freq
a 3
b 5
c 7
d 7
f 7

Sorted Freq items:


Item Freq
c 7
d 7
f 7
b 5
a 3

Ordered freq items


TID Items bought (ordered) freq items
1 b, d, f, r d,f,b
2 b, c, d, s c,d,b
3 c, m, t c
4 b, d, f d,f,b
5 a, d, f d,f,a
6 e, f f
7 f,h f
8 b,d,c c,d,b
9 a,l a
10 c, g c
11 c, k c
12 f, n, o f
13 b, c, d, p c,d,b
14 f, j, q f
15 c,i c
16 a,d d,a

FP-tree (header table: c, d, f, b, a)

root
  c:7
    d:3
      b:3
  d:4
    f:3
      b:2
      a:1
    a:1
  f:4
  a:1
Conditional FP-tree on “a”, count(a) = 3

Conditional pattern base:
(d:1, f:1, a:1)   ⇒   (a:1, d:1)
(d:1, a:1)        ⇒   (a:1, d:1)
(a:1)             ⇒   (a:1)
(f is dropped on the right because its count in the conditional pattern base is only 1, below the support threshold.)

Item counts in the conditional pattern base: c = 0, d = 2, f = 1, b = 0, a = 3
Sorted frequent items: a = 3, d = 2

Conditional FP-tree (header table: d):
root
  d:2

Frequent itemset generated: {a, d}: 2

Conditional FP-tree on “b”, count(b) = 5

Conditional pattern base:
(c:3, d:3, b:3)   ⇒   (b:3, d:3, c:3)
(d:2, f:2, b:2)   ⇒   (b:2, d:2, f:2)

Item counts in the conditional pattern base: c = 3, d = 5, f = 2, b = 5, a = 0
Sorted frequent items: b = 5, d = 5, c = 3, f = 2

Conditional FP-tree (header table: d, c, f):
root
  d:5
    c:3
    f:2

Conditional FP-tree on “b, f”, count(b, f) = 2

Conditional pattern base: (d:2, f:2)   ⇒   (d:2, f:2)
Item counts: d = 2

Conditional FP-tree:
root
  d:2

Frequent itemsets generated: {b, f}: 2, {b, d, f}: 2

Conditional FP-tree on “b, c”, count(b, c) = 3

Conditional pattern base: (d:3, c:3)   ⇒   (c:3, d:3)
Item counts: d = 3

Conditional FP-tree:
root
  d:3

Frequent itemsets generated: {b, c}: 3, {b, c, d}: 3

Conditional FP-tree on “b, d”, count(b, d) = 5

Conditional pattern base: (d:5)

Conditional FP-tree: root only

Frequent itemset generated: {b, d}: 5


Conditional FP-tree on “f”, count(f) = 7

Conditional pattern base:
(f:4)        ⇒   (f:4)
(d:3, f:3)   ⇒   (f:3, d:3)

Item counts: f = 7, d = 3

Conditional FP-tree (header table: d):
root
  d:3

Frequent itemset generated: {d, f}: 3

Conditional FP-tree on “d”, count(d) = 7

Conditional pattern base:
(c:3, d:3)   ⇒   (d:3, c:3)
(d:4)        ⇒   (d:4)

Item counts in the conditional pattern base: c = 3, d = 7, f = 0, b = 0, a = 0
Sorted frequent items: d = 7, c = 3

Conditional FP-tree (header table: c):
root
  c:3

Frequent itemset generated: {c, d}: 3

Conditional FP-tree on “c”, count(c) = 7

Conditional pattern base: (c:7)   ⇒   (c:7)
Item counts: c = 7

Conditional FP-tree: root only

Freq itemsets
= { {c}, {d}, {f}, {b}, {a},
    {a, d}, {b, c}, {b, d}, {b, f}, {c, d}, {d, f},
    {b, c, d}, {b, d, f} }
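(As a cross-check, the short Python sketch below brute-forces the frequent itemsets directly from the 16 transactions. It assumes a minimum support count of 2, the threshold implied by itemsets such as {a, d} and {b, f}; it is only a verification aid, not the FP-growth procedure itself.)

```python
from itertools import combinations

# Transactions of Q1 (Version 1); the one-off items cannot become frequent,
# so keeping them in the sets does not affect the result.
transactions = [
    {'b','d','f','r'}, {'b','c','d','s'}, {'c','m','t'}, {'b','d','f'},
    {'a','d','f'}, {'e','f'}, {'f','h'}, {'b','d','c'},
    {'a','l'}, {'c','g'}, {'c','k'}, {'f','n','o'},
    {'b','c','d','p'}, {'f','j','q'}, {'c','i'}, {'a','d'},
]

min_support = 2                               # assumed threshold (see note above)
frequent_single = ['a', 'b', 'c', 'd', 'f']   # items whose count reaches min_support

frequent = {}
for k in range(1, len(frequent_single) + 1):
    for cand in combinations(frequent_single, k):
        count = sum(1 for t in transactions if set(cand) <= t)
        if count >= min_support:
            frequent[frozenset(cand)] = count

for itemset, count in sorted(frequent.items(),
                             key=lambda kv: (len(kv[0]), sorted(kv[0]))):
    print(sorted(itemset), count)             # reproduces the 13 itemsets listed above
```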


Q1 (15 Marks) (Version 2)

Item Freq
a 1
b 1
c 1
d 1
e 1
f 1
g 1
h 1
i 1
j 1
k 1
l 1
r 3
s 5
t 6
u 7
v 1
w 7
x 1
y 1
z 1

Freq items:
Item Freq
r 3
s 5
t 6
u 7
w 7

Sorted Freq items:


Item Freq
u 7
w 7
t 6
s 5
r 3


Ordered freq items


TID Items bought (ordered) freq items
1 s, u, w, i u,w,s
2 s, t, u, j u,t,s
3 t, d, k t
4 s, u, w u,w,s
5 r, u, w u,w,r
6 v, w w
7 w, y w
8 s, u, l u,s
9 r, c r
10 t, x t
11 t, b t
12 w, e, f w
13 s, t, u, g u,t,s
14 w, a, h w
15 t,z t
16 r,u u,r

FP-tree (header table: u, w, t, s, r)

root
  u:7
    w:3
      s:2
      r:1
    t:2
      s:2
    s:1
    r:1
  w:4
  t:4
  r:1


Conditional FP-tree on “r”, count(r) = 3

Conditional pattern base:
(u:1, r:1)        ⇒   (r:1, u:1)
(u:1, w:1, r:1)   ⇒   (r:1, u:1)
(r:1)             ⇒   (r:1)

Item counts in the conditional pattern base: u = 2, w = 1, t = 0, s = 0, r = 3
Sorted frequent items: r = 3, u = 2

Conditional FP-tree (header table: u):
root
  u:2

Frequent itemset generated: {r, u}: 2

Conditional FP-tree on “s”, count(s) = 5

Conditional pattern base:
(u:1, s:1)        ⇒   (s:1, u:1)
(u:2, w:2, s:2)   ⇒   (s:2, u:2, w:2)
(u:2, t:2, s:2)   ⇒   (s:2, u:2, t:2)

Item counts in the conditional pattern base: u = 5, w = 2, t = 2, s = 5, r = 0
Sorted frequent items: s = 5, u = 5, t = 2, w = 2

Conditional FP-tree (header table: u, t, w):
root
  u:5
    w:2
    t:2


Conditional FP-tree on “u, s”, count(u, s) = 5

Conditional pattern base: (u:5)

Conditional FP-tree: root only

Frequent itemset generated: {u, s}: 5

Conditional FP-tree on “s, w”, count(s, w) = 2

Conditional pattern base: (u:2, w:2)

Conditional FP-tree (header table: u):
root
  u:2

Frequent itemsets generated: {w, s}: 2, {u, s, w}: 2

Conditional FP-tree on “s, t”, count(s, t) = 2

Conditional pattern base: (u:2, t:2)

Conditional FP-tree (header table: u):
root
  u:2

Frequent itemsets generated: {t, s}: 2, {u, s, t}: 2

Conditional FP-tree on “t”, count(t) = 6

Conditional pattern base:
(t:4)        ⇒   (t:4)
(u:2, t:2)   ⇒   (t:2, u:2)

Conditional FP-tree (header table: u):
root
  u:2

Frequent itemset generated: {u, t}: 2

Conditional FP-tree on “w”, count(w) = 7

Conditional pattern base:
(u:3, w:3)   ⇒   (w:3, u:3)
(w:4)        ⇒   (w:4)

Conditional FP-tree (header table: u):
root
  u:3

Frequent itemset generated: {u, w}: 3

Conditional FP-tree on “u”, count(u) = 7

Conditional pattern base: (u:7)

Conditional FP-tree: root only

Freq itemsets
= { {u}, {w}, {t}, {s}, {r},
    {r, u}, {u, s}, {w, s}, {t, s}, {u, t}, {u, w},
    {u, s, w}, {u, s, t} }

Q2 (15 Marks)

Center (1, 2) (2, 0) (-10, -5) (-5, -2) (10, 12) (8, 6) (-8, -6) (2, 1)
Points 1 2 3 4 5 6 7 8
(1, 2) 1 0
(2, 0) 2 2.24 0
(-10, -5) 3 13.04 13 0
(-5, -2) 4 7.21 7.28 5.83 0
(10, 12) 5 13.45 14.42 26.25 20.52 0
(8, 6) 6 8.06 8.49 21.1 15.26 6.32 0
(-8, -6) 7 12.04 11.66 2.24 5 25.46 20 0
(2, 1) 8 1.41 1 13.42 7.62 13.6 7.81 12.21 0

Center (1, 2) (2, 0.5) (-10, -5) (-5, -2) (10, 12) (8, 6) (-8, -6)
Points 1 (28) 3 4 5 6 7
(1, 2) 1 0
(2, 0.5) (28) 1.8 0
(-10, -5) 3 13.04 13.2 0
(-5, -2) 4 7.21 7.43 5.83 0
(10, 12) 5 13.45 14.01 26.25 20.52 0
(8, 6) 6 8.06 8.14 21.1 15.26 6.32 0
(-8, -6) 7 12.04 11.93 2.24 5 25.46 20 0

Center (1.5, 1.25) (-10, -5) (-5, -2) (10, 12) (8, 6) (-8, -6)
Points (128) 3 4 5 6 7
(1.5, 1.25) (128) 0
(-10, -5) 3 13.09 0
(-5, -2) 4 7.27 5.83 0
(10, 12) 5 13.7 26.25 20.52 0
(8, 6) 6 8.05 21.1 15.26 6.32 0
(-8, -6) 7 11.95 2.24 5 25.46 20 0

Center (1.5, 1.25) (-9, -5.5) (-5, -2) (10, 12) (8, 6)
Points (128) (37) 4 5 6
(1.5, 1.25) (128) 0
(-9, -5.5) (37) 12.48 0
(-5, -2) 4 7.27 5.32 0
(10, 12) 5 13.7 25.83 20.52 0
(8, 6) 6 8.05 20.52 15.26 6.32 0


Center (1.5, 1.25) (-7, -3.75) (10, 12) (8, 6)


Points (128) (347) 5 6
(1.5, 1.25) (128) 0
(-7, -3.75) (347) 9.86 0
(10, 12) 5 13.7 23.17 0
(8, 6) 6 8.05 17.89 6.32 0

Center (1.5, 1.25) (-7, -3.75) (9, 9)


Points (128) (347) (56)
(1.5, 1.25) (128) 0
(-7, -3.75) (347) 9.86 0
(9, 9) (56) 10.78 20.46 0

Center (-2.75, -1.25) (9, 9)


Points (123478) (56)
(-2.75, -1.25) (123478) 0
(9, 9) (56) 15.59 0

Dendrogram (not to scale): 2 and 8 merge first (at distance 1), then 1 joins {2, 8} (1.8), then 3 and 7 merge (2.24), then 4 joins {3, 7} (5.32), then 5 and 6 merge (6.32), then {1, 2, 8} joins {3, 4, 7} (9.86), and finally the two remaining clusters {1, 2, 3, 4, 7, 8} and {5, 6} merge (15.59).
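(The merge sequence can be replayed with the short sketch below, assuming the midpoint-of-centroids update used in the tables above, i.e. each merge replaces two clusters by the point midway between their centers. It prints the merge distances 1, 1.8, 2.24, 5.32, 6.32, 9.86 and 15.59.)

```python
from itertools import combinations
import numpy as np

# The eight points from Q2.
points = {1: (1, 2), 2: (2, 0), 3: (-10, -5), 4: (-5, -2),
          5: (10, 12), 6: (8, 6), 7: (-8, -6), 8: (2, 1)}

# Each cluster is a frozenset of point IDs mapped to its representative center.
clusters = {frozenset([i]): np.array(p, dtype=float) for i, p in points.items()}

while len(clusters) > 1:
    # Pick the pair of clusters whose centers are closest (Euclidean distance).
    a, b = min(combinations(clusters, 2),
               key=lambda ab: np.linalg.norm(clusters[ab[0]] - clusters[ab[1]]))
    d = np.linalg.norm(clusters[a] - clusters[b])
    center = (clusters[a] + clusters[b]) / 2        # midpoint of the two centers
    print(f"merge {sorted(a)} + {sorted(b)} at distance {d:.2f}, "
          f"new center {center.round(2)}")
    clusters[frozenset(a | b)] = center
    del clusters[a], clusters[b]
```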

Q3 (15 Marks)

(a)
Yes.
P(LC = Yes | FH = Yes, S = Yes, PR = No) = 0.4375

By Bayes' rule,
P(LC = Yes | FH = Yes, S = Yes, PR = No)
= P(PR = No | LC = Yes, FH = Yes, S = Yes) * P(LC = Yes | FH = Yes, S = Yes) / P(PR = No | FH = Yes, S = Yes)
= 0.4375

P(PR = No | LC = Yes, FH = Yes, S = Yes)
= P(PR = No | LC = Yes)        (conditional independence)
= 1 − x

P(LC = Yes | FH = Yes, S = Yes) = 0.7

P(PR = No | FH = Yes, S = Yes)
= P(PR = No | LC = Yes) * P(LC = Yes | FH = Yes, S = Yes) + P(PR = No | LC = No) * P(LC = No | FH = Yes, S = Yes)
= (1 − x) * 0.7 + (1 − y) * 0.3

Thus, (1 − x) * 0.7 / ((1 − x) * 0.7 + (1 − y) * 0.3) = 0.4375.

Together with x = 2y, solving gives x = 0.8 and y = 0.4.
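(A quick symbolic check of the last step, using sympy, which is not part of the original answer, to solve the equation above together with x = 2y:)

```python
import sympy as sp

x, y = sp.symbols('x y')
eq = sp.Eq((1 - x) * sp.Rational(7, 10)
           / ((1 - x) * sp.Rational(7, 10) + (1 - y) * sp.Rational(3, 10)),
           sp.Rational(7, 16))                    # 0.4375 = 7/16
print(sp.solve([eq, sp.Eq(x, 2 * y)], [x, y]))    # [(4/5, 2/5)], i.e. x = 0.8, y = 0.4
```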

(b) Disadvantages:
The Bayesian Belief Network classifier requires predefined knowledge about the network structure.
The Bayesian Belief Network classifier cannot work directly when the network contains cycles.

Q4 (15 Marks) (Version 1)

mean vector = (29/4, 29/4)ᵀ = (7.25, 7.25)ᵀ

For data (6, 6), difference from mean vector = (6 − 7.25, 6 − 7.25)ᵀ = (−5/4, −5/4)ᵀ = (−1.25, −1.25)ᵀ
For data (8, 8), difference from mean vector = (8 − 7.25, 8 − 7.25)ᵀ = (3/4, 3/4)ᵀ = (0.75, 0.75)ᵀ
For data (5, 11), difference from mean vector = (5 − 7.25, 11 − 7.25)ᵀ = (−9/4, 15/4)ᵀ = (−2.25, 3.75)ᵀ
For data (10, 4), difference from mean vector = (10 − 7.25, 4 − 7.25)ᵀ = (11/4, −13/4)ᵀ = (2.75, −3.25)ᵀ

Y = [ −1.25   0.75   −2.25    2.75 ]
    [ −1.25   0.75    3.75   −3.25 ]

Σ = (1/4) Y Yᵀ
  = (1/4) [  14.75   −15.25 ]
          [ −15.25    26.75 ]
  = [  59/16   −61/16 ]  =  [  3.6875   −3.8125 ]
    [ −61/16   107/16 ]     [ −3.8125    6.6875 ]

det [ 3.6875 − λ    −3.8125     ] = 0   ⇒   λ² − 10.375λ + 10.125 = 0
    [ −3.8125        6.6875 − λ ]

⇒ λ = (83 + √4297)/16 = 9.2845   or   λ = (83 − √4297)/16 = 1.0905

When λ = (83 + √4297)/16 = 9.2845:

[ 59/16 − (83 + √4297)/16    −61/16                    ] [ x1 ]  =  [ 0 ]
[ −61/16                      107/16 − (83 + √4297)/16 ] [ x2 ]     [ 0 ]

⇒ (1/16) [ −24 − √4297    −61          ] [ x1 ]  =  [ 0 ]
         [ −61             24 − √4297  ] [ x2 ]     [ 0 ]

⇒ x1 + 0.6812 x2 = 0
We choose the eigenvector of unit length: v1 = (0.5630, −0.8265)ᵀ.

When λ = (83 − √4297)/16 = 1.0905:

⇒ (1/16) [ −24 + √4297    −61          ] [ x1 ]  =  [ 0 ]
         [ −61             24 + √4297  ] [ x2 ]     [ 0 ]

⇒ x1 − 1.4681 x2 = 0
We choose the eigenvector of unit length: v2 = (−0.8265, −0.5630)ᵀ.

(The order of the two eigenvectors, i.e. of the columns of Φ, can be interchanged.)

Thus, Φ = [  0.5630   −0.8265 ],   Y = Φᵀ X = [  0.5630   −0.8265 ] X.
          [ −0.8265   −0.5630 ]               [ −0.8265   −0.5630 ]

For data (6, 6):   Y = Φᵀ (6, 6)ᵀ   = (−1.5810,  −8.3367)ᵀ
For data (8, 8):   Y = Φᵀ (8, 8)ᵀ   = (−2.1080, −11.1156)ᵀ
For data (5, 11):  Y = Φᵀ (5, 11)ᵀ  = (−6.2764, −10.3251)ᵀ
For data (10, 4):  Y = Φᵀ (10, 4)ᵀ  = ( 2.3238, −10.5166)ᵀ

The mean vector of the above transformed data points is (−1.9104, −10.0735)ᵀ.

The final transformed data points are:
For data (6, 6):   (−1.5810,  −8.3367)ᵀ − (−1.9104, −10.0735)ᵀ = ( 0.3294,  1.7368)ᵀ
For data (8, 8):   (−2.1080, −11.1156)ᵀ − (−1.9104, −10.0735)ᵀ = (−0.1976, −1.0421)ᵀ
For data (5, 11):  (−6.2764, −10.3251)ᵀ − (−1.9104, −10.0735)ᵀ = (−4.3660, −0.2516)ᵀ
For data (10, 4):  ( 2.3238, −10.5166)ᵀ − (−1.9104, −10.0735)ᵀ = ( 4.2342, −0.4431)ᵀ

Thus, (6, 6) is reduced to (0.3294);
      (8, 8) is reduced to (−0.1976);
      (5, 11) is reduced to (−4.3660);
      (10, 4) is reduced to (4.2342).

(Note: Another possible answer is
      (6, 6) is reduced to (−0.3294);
      (8, 8) is reduced to (0.1976);
      (5, 11) is reduced to (4.3660);
      (10, 4) is reduced to (−4.2342).
This is because the eigenvectors used in this case are
      v1 = (−0.5630, 0.8265)ᵀ and v2 = (0.8265, 0.5630)ᵀ.)
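(As a numerical cross-check of Version 1, the numpy sketch below recomputes the covariance matrix, the principal eigenvector and the one-dimensional projection. Depending on the eigenvector sign numpy returns, the output is one of the two answers above.)

```python
import numpy as np

X = np.array([[6, 6], [8, 8], [5, 11], [10, 4]], dtype=float)

mean = X.mean(axis=0)                     # (7.25, 7.25)
Yc = (X - mean).T                         # centered data, one column per point
Sigma = Yc @ Yc.T / X.shape[0]            # [[3.6875, -3.8125], [-3.8125, 6.6875]]

eigvals, eigvecs = np.linalg.eigh(Sigma)  # eigenvalues in ascending order: 1.0905, 9.2845
pc1 = eigvecs[:, -1]                      # eigenvector of the largest eigenvalue

scores = (X - mean) @ pc1                 # projection onto the first principal component
print(np.round(scores, 4))                # +/- [0.3294, -0.1976, -4.3660, 4.2342]
```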


Q4 (15 Marks) (Version 2)

mean vector = (87/4, 87/4)ᵀ = (21.75, 21.75)ᵀ

For data (18, 18), difference from mean vector = (18 − 21.75, 18 − 21.75)ᵀ = (−3.75, −3.75)ᵀ
For data (24, 24), difference from mean vector = (24 − 21.75, 24 − 21.75)ᵀ = (2.25, 2.25)ᵀ
For data (15, 33), difference from mean vector = (15 − 21.75, 33 − 21.75)ᵀ = (−6.75, 11.25)ᵀ
For data (30, 12), difference from mean vector = (30 − 21.75, 12 − 21.75)ᵀ = (8.25, −9.75)ᵀ

Y = [ −3.75   2.25   −6.75    8.25 ]
    [ −3.75   2.25   11.25   −9.75 ]

Σ = (1/4) Y Yᵀ
  = (1/4) [  132.75   −137.25 ]
          [ −137.25    240.75 ]
  = [  33.1875   −34.3125 ]
    [ −34.3125    60.1875 ]

det [ 33.1875 − λ    −34.3125     ] = 0
    [ −34.3125        60.1875 − λ ]

⇒ λ = 83.5602   or   λ = 9.8148

When λ = 83.5602:

[ 33.1875 − 83.5602    −34.3125           ] [ x1 ]  =  [ 0 ]
[ −34.3125              60.1875 − 83.5602 ] [ x2 ]     [ 0 ]

⇒ [ −50.3727   −34.3125 ] [ x1 ]  =  [ 0 ]
  [ −34.3125   −23.3727 ] [ x2 ]     [ 0 ]

⇒ x1 + 0.6812 x2 = 0
We choose the eigenvector of unit length: v1 = (0.5630, −0.8265)ᵀ.

When λ = 9.8148:

[ 33.1875 − 9.8148    −34.3125          ] [ x1 ]  =  [ 0 ]
[ −34.3125             60.1875 − 9.8148 ] [ x2 ]     [ 0 ]

⇒ [  23.3727   −34.3125 ] [ x1 ]  =  [ 0 ]
  [ −34.3125    50.3727 ] [ x2 ]     [ 0 ]

⇒ x1 − 1.4681 x2 = 0
We choose the eigenvector of unit length: v2 = (−0.8265, −0.5630)ᵀ.

Thus, Φ = [  0.5630   −0.8265 ],   Y = Φᵀ X = [  0.5630   −0.8265 ] X.
          [ −0.8265   −0.5630 ]               [ −0.8265   −0.5630 ]

For data (18, 18):  Y = Φᵀ (18, 18)ᵀ = ( −4.7431, −25.0101)ᵀ
For data (24, 24):  Y = Φᵀ (24, 24)ᵀ = ( −6.3241, −33.3468)ᵀ
For data (15, 33):  Y = Φᵀ (15, 33)ᵀ = (−18.8291, −30.9752)ᵀ
For data (30, 12):  Y = Φᵀ (30, 12)ᵀ = (  6.9715, −31.5499)ᵀ

The mean vector of the above transformed data points is (−5.7312, −30.2205)ᵀ.

The final transformed data points are:
For data (18, 18):  ( −4.7431, −25.0101)ᵀ − (−5.7312, −30.2205)ᵀ = (  0.9881,  5.2104)ᵀ
For data (24, 24):  ( −6.3241, −33.3468)ᵀ − (−5.7312, −30.2205)ᵀ = ( −0.5929, −3.1263)ᵀ
For data (15, 33):  (−18.8291, −30.9752)ᵀ − (−5.7312, −30.2205)ᵀ = (−13.0979, −0.7547)ᵀ
For data (30, 12):  (  6.9715, −31.5499)ᵀ − (−5.7312, −30.2205)ᵀ = ( 12.7027, −1.3294)ᵀ

Thus, (18, 18) is reduced to (0.9881);
      (24, 24) is reduced to (−0.5929);
      (15, 33) is reduced to (−13.0979);
      (30, 12) is reduced to (12.7027).

(Note: Another possible answer is
      (18, 18) is reduced to (−0.9881);
      (24, 24) is reduced to (0.5929);
      (15, 33) is reduced to (13.0979);
      (30, 12) is reduced to (−12.7027).
This is because the eigenvectors used in this case are
      v1 = (−0.5630, 0.8265)ᵀ and v2 = (0.8265, 0.5630)ᵀ.)
Q5 (15 Marks)
(a)
Yes. The number is 3. By looking at the second portion of the chart, we have 10 – 7 = 3

(b)
Yes. The number is 7. By looking at the first portion of the chart, we have 7 – 0 = 7

(c)
Yes. The number is 6. By looking at the third portion of the chart, we have 16 – 10 = 6

(d)
Yes. The number is 14. By looking at the fourth portion of the chart, we have 30 – 16 = 14

(e)
Yes. The chart plots Decile mean / Global mean on the y-axis against the decile number (1 to 10) on the x-axis; the bars for the leading deciles take the values 2.33, 2.33, 1.55, 0.77 and 0.77.

Global mean = 13/30

Dividing the data into ten deciles, the value plotted for each decile is Decile mean / Global mean.

(f)

Yes.

Precision = True Positive/ (True Positive + False Positive) = 7/(7+3) = 7/10

Recall = True Positive/ (True Positive + False Negative) = 7/(7+6) = 7/13

F1-score = (2 x Precision x Recall)/(Precision + Recall) = (2 x 7/10 x 7/13)/(7/10 + 7/13) = 14/23 = 0.609
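(The fractions can be checked mechanically from the counts used above:)

```python
from fractions import Fraction

tp, fp, fn = 7, 3, 6                      # counts used in the answer above
precision = Fraction(tp, tp + fp)         # 7/10
recall = Fraction(tp, tp + fn)            # 7/13
f1 = 2 * precision * recall / (precision + recall)
print(precision, recall, f1, float(f1))   # 7/10 7/13 14/23 0.6086956521739131
```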

Q6 (15 Marks)

First Choice Second Choice Third Choice Fourth Choice


b 700
c 350 50 50 50
d 640 320
e 360 60 60 20
f 390 90 90 30
g 580 180 180
h 330 230 70 60
i 320 320 160 160
j 190 90 90 45
k 185 85 85 40
l 180 80 80 35
m 195 195 115 115

Resulting views = {b, d, g, i}

First Choice Second Choice Third Choice Fourth Choice


b 700
c 350 50 50 50
d 640 320 160 160
e 360 60 60 20
f 390 90 90 30
g 580 180 180
h 330 230 120 110
i 320 320
j 190 90 90 45
k 185 85 85 40
l 180 80 80 35
m 195 195 35 35

Resulting views = {b, i, g, d}

Q7 (15 Marks)
(a)
ID x1 x2 y
a 13 13 1
b 19 9 1
c 21 15 1
d 15 19 1
e 7 9 -1
f 5 7 -1
g 9 9 -1
h 7 5 -1

(b)
Minimize w1² + w2²
subject to
13w1 + 13w2 + b ≥ 1
19w1 + 9w2 + b ≥ 1
21w1 + 15w2 + b ≥ 1
15w1 + 19w2 + b ≥ 1
−7w1 − 9w2 − b ≥ 1
−5w1 − 7w2 − b ≥ 1
−9w1 − 7w2 − b ≥ 1
−7w1 − 5w2 − b ≥ 1
where w1, w2 and b are real numbers

(c)
net = 13w1 + 13w2 + b = 2.7
y=0.9910
w1 = 0.1+0.5*(1-0.9910)*13=0.1585
w2 = 0.1+0.5*(1-0.9910)*13=0.1585
b = 0.1+0.5*(1-0.9910)=0.1045

net=19w1 +9w2 +b=4.5425


y=0.9998
w1 = 0.1585+0.5*(1-0.9998)*19=0.1604
w2 = 0.1585+0.5*(1-0.9998)*9=0.1594
b = 0.1045+0.5*(1-0.9998)=0.1046

net=21w1 +15w2 +b=5.864


y=1
w1 = 0.1604+0.5*(1-1)*21=0.1604
w2 = 0.1594+0.5*(1-1)*15=0.1594
b = 0.1046+0.5*(1-1)=0.1046

net=15w1 +19w2 +b=5.5392


y=1
w1 = 0.1604+0.5*(1-1)*15=0.1604
w2 = 0.1594+0.5*(1-1)*19=0.1594
b = 0.1046+0.5*(1-1)=0.1046

net=7w1 +9w2 +b=2.662


y=0.9903
w1 = 0.1604+0.5*(-1-0.9903)*7=-6.8057
w2 = 0.1594+0.5*(-1-0.9903)*9=-8.7970
b = 0.1046+0.5*(-1-0.9903)=-0.8906

net=5w1 +7w2 +b=-96.4981


y=-1
w1 = -6.8057+0.5*(-1+1)*5=-6.8057
w2 =-8.7970+0.5*(-1+1)*7=-8.7970
b = -0.8906+0.5*(-1+1)= -0.8906
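(The updates in (c) follow one pattern: output y = tanh(net), an assumption that matches y = 0.9910 for net = 2.7; learning rate 0.5; and w ← w + 0.5(target − y)x, b ← b + 0.5(target − y). The sketch below replays the six records; the printed values agree with the hand computation up to the rounding used there.)

```python
import numpy as np

# The six training records processed above: ((x1, x2), target).
records = [((13, 13), 1), ((19, 9), 1), ((21, 15), 1),
           ((15, 19), 1), ((7, 9), -1), ((5, 7), -1)]

w = np.array([0.1, 0.1])   # initial weights w1, w2
b = 0.1                    # initial bias
lr = 0.5                   # assumed learning rate, matching the 0.5 factor above

for x, target in records:
    x = np.asarray(x, dtype=float)
    net = w @ x + b
    out = np.tanh(net)               # assumed activation (see note above)
    w = w + lr * (target - out) * x
    b = b + lr * (target - out)
    print(f"net={net:.4f}  y={out:.4f}  w1={w[0]:.4f}  w2={w[1]:.4f}  b={b:.4f}")
```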

(d)
The neural network has an assumption that records in the training set are “independent”.
In some cases, records in the training set are related/correlated to (or dependent on) other records in the
training set. Thus, the neural network could not capture this “dependent” scenario well if the training set has
dependent records.


Q8 (15 Marks)
(a)
Adjacency matrix

(b)
Stochastic matrix
x y z
x 0 ½ ½
y ½ ½ ½
z ½ 0 0

(c)
1. Site x has to remove the link from site x to site y
2. Site x has to remove the link from site x to z
3. [Optional] Site x has to create a link from site x to itself

(d)

r_n = 0.8 M r_0 + c, where c = (0.2, 0.2, 0.2)ᵀ

[ r_n,1 ]         [ m11  m12  m13 ] [ r_0,1 ]   [ 0.2 ]
[ r_n,2 ]  = 0.8  [ m21  m22  m23 ] [ r_0,2 ] + [ 0.2 ]
[ r_n,3 ]         [ m31  m32  m33 ] [ r_0,3 ]   [ 0.2 ]

           [ 0.8(m11 r_0,1 + m12 r_0,2 + m13 r_0,3) + 0.2 ]
         = [ 0.8(m21 r_0,1 + m22 r_0,2 + m23 r_0,3) + 0.2 ]
           [ 0.8(m31 r_0,1 + m32 r_0,2 + m33 r_0,3) + 0.2 ]

Sum of the values in r_n
= 0.8(m11 r_0,1 + m12 r_0,2 + m13 r_0,3 + m21 r_0,1 + m22 r_0,2 + m23 r_0,3 + m31 r_0,1 + m32 r_0,2 + m33 r_0,3) + (0.2 + 0.2 + 0.2)
= 0.8[(m11 + m21 + m31) r_0,1 + (m12 + m22 + m32) r_0,2 + (m13 + m23 + m33) r_0,3] + 0.2 × 3
= 0.8(1·r_0,1 + 1·r_0,2 + 1·r_0,3) + 0.2 × 3      (each column of M sums to 1)
= 0.8(r_0,1 + r_0,2 + r_0,3) + 0.2 × 3
= 0.8 × 3 + 0.2 × 3      (the initial ranks sum to 3)
= 3
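(A numerical illustration of (d): taking M from part (b) as printed and an assumed starting vector r0 = (1, 1, 1)ᵀ whose entries sum to 3, the total rank stays at 3 after every damped step.)

```python
import numpy as np

M = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.5],
              [0.5, 0.0, 0.0]])        # column-stochastic matrix from part (b)
r = np.ones(3)                          # assumed r0; its entries sum to 3

for _ in range(20):
    r = 0.8 * M @ r + 0.2               # one damped PageRank step
    assert abs(r.sum() - 3) < 1e-9      # the total stays at 3, as shown in (d)

print(r, r.sum())
```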


Part B
Version 1

Question Your Answer


Q9 C
Q10 E
Q11 B
Q12 B
Q13 A
Q14 A
Q15 A
Q16 E
Q17 C
Q18 A
Q19 B
Q20 E
Q21 B
Q22 B
Q23 A
Q24 B

Version 2

Question Your Answer


Q9 R
Q10 T
Q11 Q
Q12 Q
Q13 P
Q14 P
Q15 P
Q16 T
Q17 Q
Q18 P
Q19 Q
Q20 R
Q21 P
Q22 Q
Q23 T
Q24 Q

Version 3

Question Your Answer


Q9 X
Q10 Z
Q11 W
Q12 W
Q13 V
Q14 V
Q15 V
Q16 Z
Q17 W
Q18 W
Q19 V
Q20 W
Q21 X
Q22 V
Q23 W
Q24 Z

Version 4

Question Your Answer


Q9 B
Q10 E
Q11 C
Q12 B
Q13 D
Q14 A
Q15 A
Q16 E
Q17 B
Q18 B
Q19 A
Q20 B
Q21 C
Q22 A
Q23 B
Q24 E

Version 5

Question Your Answer


Q9 Q
Q10 T
Q11 R
Q12 Q
Q13 S
Q14 P
Q15 P
Q16 T
Q17 R
Q18 P
Q19 Q
Q20 T
Q21 Q
Q22 Q
Q23 P
Q24 Q

Version 6

Question Your Answer


Q9 W
Q10 Z
Q11 X
Q12 W
Q13 Y
Q14 V
Q15 V
Q16 Z
Q17 W
Q18 V
Q19 W
Q20 X
Q21 V
Q22 W
Q23 Z
Q24 W

Version 7

Question Your Answer


Q9 C
Q10 E
Q11 A
Q12 B
Q13 A
Q14 A
Q15 E
Q16 E
Q17 B
Q18 A
Q19 B
Q20 C
Q21 A
Q22 B
Q23 E
Q24 B

Version 8

Question Your Answer


Q9 R
Q10 T
Q11 P
Q12 Q
Q13 P
Q14 P
Q15 T
Q16 T
Q17 Q
Q18 Q
Q19 P
Q20 Q
Q21 R
Q22 P
Q23 Q
Q24 T

Version 9

Question Your Answer


Q9 X
Q10 Z
Q11 V
Q12 W
Q13 V
Q14 V
Q15 Z
Q16 Z
Q17 X
Q18 V
Q19 W
Q20 Z
Q21 W
Q22 W
Q23 V
Q24 W
End of Paper
