INSTRUCTOR’S

SOLUTIONS MANUAL

PROBABILITY AND STATISTICAL INFERENCE
TENTH EDITION

Robert V. Hogg
Elliot A. Tanis
Dale L. Zimmerman
The author and publisher of this book have used their best efforts in preparing this book. These efforts include the
development, research, and testing of the theories and programs to determine their effectiveness. The author and publisher
make no warranty of any kind, expressed or implied, with regard to these programs or the documentation contained in this
book. The author and publisher shall not be liable in any event for incidental or consequential damages in connection with,
or arising out of, the furnishing, performance, or use of these programs.

Reproduced by Pearson from electronic files supplied by the author.

Copyright © 2020, 2015, 2010 Pearson Education, Inc.


Publishing as Pearson, 330 Hudson Street, New York, NY 10013

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form
or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the
publisher. Printed in the United States of America.

ISBN-13: 978-0-13-518948-1
ISBN-10: 0-13-518948-9

Contents

Preface v

1 Probability 1
1.1 Properties of Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Methods of Enumeration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Conditional Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4 Independent Events . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.5 Bayes’ Theorem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5

2 Discrete Distributions 7
2.1 Random Variables of the Discrete Type . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.2 Mathematical Expectation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.3 Special Mathematical Expectations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.4 The Binomial Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.5 The Hypergeometric Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.6 The Negative Binomial Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.7 The Poisson Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

3 Continuous Distributions 19
3.1 Random Variables of the Continuous Type . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.2 The Exponential, Gamma, and Chi-Square Distributions . . . . . . . . . . . . . . . . . 26
3.3 The Normal Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
3.4 Additional Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

4 Bivariate Distributions 33
4.1 Bivariate Distributions of the Discrete Type . . . . . . . . . . . . . . . . . . . . . . . . 33
4.2 The Correlation Coefficient . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
4.3 Conditional Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.4 Bivariate Distributions of the Continuous Type . . . . . . . . . . . . . . . . . . . . . . 37
4.5 The Bivariate Normal Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41

5 Distributions of Functions of Random Variables 45


5.1 Functions of One Random Variable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.2 Transformations of Two Random Variables . . . . . . . . . . . . . . . . . . . . . . . . 47
5.3 Several Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
5.4 The Moment-Generating Function Technique . . . . . . . . . . . . . . . . . . . . . . . 53
5.5 Random Functions Associated with Normal Distributions . . . . . . . . . . . . . . . . 55
5.6 The Central Limit Theorem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
5.7 Approximations for Discrete Distributions . . . . . . . . . . . . . . . . . . . . . . . . . 59
5.8 Chebyshev’s Inequality and Convergence in Probability . . . . . . . . . . . . . . . . . 61
5.9 Limiting Moment-Generating Functions . . . . . . . . . . . . . . . . . . . . . . . . . . 62


6 Point Estimation 63
6.1 Descriptive Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
6.2 Exploratory Data Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.3 Order Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
6.4 Maximum Likelihood Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
6.5 A Simple Regression Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
6.6 Asymptotic Distributions of Maximum Likelihood Estimators . . . . . . . . . . . . . . 81
6.7 Sufficient Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
6.8 Bayesian Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84

7 Interval Estimation 87
7.1 Confidence Intervals for Means . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
7.2 Confidence Intervals for the Difference of Two Means . . . . . . . . . . . . . . . . . . . 88
7.3 Confidence Intervals for Proportions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
7.4 Sample Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
7.5 Distribution-Free Confidence Intervals for Percentiles . . . . . . . . . . . . . . . . . . . 92
7.6 More Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
7.7 Resampling Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99

8 Tests of Statistical Hypotheses 107


8.1 Tests About One Mean . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
8.2 Tests of the Equality of Two Means . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
8.3 Tests for Variances . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
8.4 Tests about Proportions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
8.5 Some Distribution-Free Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
8.6 Power of a Statistical Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
8.7 Best Critical Regions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
8.8 Likelihood Ratio Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124

9 More Tests 127


9.1 Chi-Square Goodness-of-Fit Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
9.2 Contingency Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
9.3 One-Factor Analysis of Variance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
9.4 Two-Way Analysis of Variance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
9.5 General Factorial and 2k Factorial Designs . . . . . . . . . . . . . . . . . . . . . . . . . 135
9.6 Tests Concerning Regression and Correlation . . . . . . . . . . . . . . . . . . . . . . . 136
9.7 Statistical Quality Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137


Preface

This solutions manual provides answers for the even-numbered exercises in Probability and Statistical
Inference, tenth edition, by Robert V. Hogg, Elliot A. Tanis, and Dale L. Zimmerman. Complete
solutions are given for most of these exercises. You, the instructor, may decide how many of these
solutions and answers you want to make available to your students. Note that the answers for the
odd-numbered exercises are given in the textbook. Our hope is that this solutions manual will be
helpful to each of you in your teaching.
All of the figures in this manual were generated using Maple, a computer algebra system. Most
of the figures were generated, and many of the solutions (especially those involving data) were computed,
using procedures written by Zaven Karian of Denison University. We thank him for
providing these. These procedures are available free of charge for your use and can be downloaded
at http://www.math.hope.edu/tanis/. Short descriptions of these procedures are provided
on the “Maple Card.” Complete descriptions of these procedures are given in Probability and Statistics:
Explorations with MAPLE, second edition, 1999, written by Zaven Karian and Elliot Tanis, published
by Prentice Hall (ISBN 0-13-021536-8). You can download a slightly revised edition of this manual
at http://www.math.hope.edu/tanis/MapleManual.pdf.
We also want to acknowledge the many suggestions/corrections that were made by our accuracy
checker, Kyle Siegrist.
If you find an error or wish to make a suggestion, please send it to [email protected].
Errata will be posted at http://homepage.divms.uiowa.edu/~dzimmer/.

E.A.T.
D.L.Z.


Chapter 1

Probability

1.1 Properties of Probability


1.1-2 Sketch a figure and fill in the probabilities of each of the disjoint sets.
Let A = {insure more than one car}, P (A) = 0.85.
Let B = {insure a sports car}, P (B) = 0.23.
Let C = {insure exactly one car}, P (C) = 0.15.
It is also given that P (A ∩ B) = 0.17. Since A ∩ C = ∅, P (A ∩ C) = 0. It follows that
P (A ∩ B ∩ C′) = 0.17. Thus P (A′ ∩ B ∩ C) = 0.06 and P (B′ ∩ C) = 0.09.

1.1-4 (a) S = {HHHH, HHHT, HHTH, HTHH, THHH, HHTT, HTTH, TTHH,
HTHT, THTH, THHT, HTTT, THTT, TTHT, TTTH, TTTT};
(b) (i) 5/16, (ii) 0, (iii) 11/16, (iv) 4/16, (v) 4/16, (vi) 9/16, (vii) 4/16.

1.1-6 (a) P (A ∪ B) = 0.5 + 0.6 − 0.4 = 0.7;


(b) A = (A ∩ B′) ∪ (A ∩ B), so
P (A) = P (A ∩ B′) + P (A ∩ B)
0.5 = P (A ∩ B′) + 0.4
P (A ∩ B′) = 0.1;

(c) P (A′ ∪ B′) = P [(A ∩ B)′] = 1 − P (A ∩ B) = 1 − 0.4 = 0.6.

1.1-8 Let A = {lab work done}, B = {referral to a specialist},


P (A) = 0.41, P (B) = 0.53, P ([A ∪ B]′) = 0.21.
P (A ∪ B) = P (A) + P (B) − P (A ∩ B)
0.79 = 0.41 + 0.53 − P (A ∩ B)
P (A ∩ B) = 0.41 + 0.53 − 0.79 = 0.15.

1.1-10 A∪B∪C = A ∪ (B ∪ C)
P (A ∪ B ∪ C) = P (A) + P (B ∪ C) − P [A ∩ (B ∪ C)]
= P (A) + P (B) + P (C) − P (B ∩ C) − P [(A ∩ B) ∪ (A ∩ C)]
= P (A) + P (B) + P (C) − P (B ∩ C) − P (A ∩ B) − P (A ∩ C)
+ P (A ∩ B ∩ C).

1.1-12 (a) 1/3; (b) 2/3; (c) 0; (d) 1/2.


1.1-14 $P(A) = \dfrac{2[\,r - r(\sqrt{3}/2)\,]}{2r} = 1 - \dfrac{\sqrt{3}}{2}$.
1.1-16 Note that the respective probabilities are $p_0$, $p_1 = p_0/4$, $p_2 = p_0/4^2$, and so on. Then
$$\sum_{k=0}^{\infty} \frac{p_0}{4^k} = 1, \qquad \frac{p_0}{1 - 1/4} = 1, \qquad p_0 = \frac{3}{4},$$
so that
$$1 - p_0 - p_1 = 1 - \frac{15}{16} = \frac{1}{16}.$$
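
As a quick numerical check of this geometric-series argument, here is a minimal Python sketch (not part of the manual's Maple procedures; the names are illustrative):

```python
# Probabilities p_k = p0 / 4^k with p0 = 3/4, as derived above.
p0 = 3 / 4
probs = [p0 / 4**k for k in range(60)]   # the tail beyond k = 59 is negligible

print(sum(probs))                 # ~1.0, confirming p0 = 3/4 normalizes the series
print(1 - probs[0] - probs[1])    # 0.0625 = 1/16
```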

1.2 Methods of Enumeration


1.2-2 (a) (4)(5)(2) = 40; (b) (2)(2)(2) = 8.
1.2-4 (a) $4\binom{6}{3} = 80$;

(b) $4(2^6) = 256$;

(c) $\dfrac{(4-1+3)!}{(4-1)!\,3!} = 20$.
1.2-6 S = {DDD, DDFD, DFDD, FDDD, DDFFD, DFDFD, FDDFD, DFFDD,
FDFDD, FFDDD, FFF, FFDF, FDFF, DFFF, FFDDF, FDFDF,
DFFDF, FDDFF, DFDFF, DDFFF}, so there are 20 possibilities. Note that the
winning player (2 choices) must win the last set and two of the previous sets, so the
number of outcomes is
$$2\left[\binom{2}{2} + \binom{3}{2} + \binom{4}{2}\right] = 2(1 + 3 + 6) = 20.$$
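
The count of 20 can also be confirmed by brute force. Below is a minimal Python sketch (illustrative, not from the manual) that enumerates every way a best-of-five series between players D and F can end:

```python
from itertools import product

series = []
for n in (3, 4, 5):                       # a best-of-five series lasts 3, 4, or 5 sets
    for seq in product("DF", repeat=n):
        winner = seq[-1]                  # the winner must take the final set
        # ...and must have exactly 2 wins before it, so the series did not end earlier
        if seq.count(winner) == 3 and seq[:-1].count(winner) == 2:
            series.append("".join(seq))

print(len(series))                        # 20
```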

1.2-8 3 · 3 · 2^12 = 36,864.


1.2-10 $\displaystyle\binom{n-1}{r} + \binom{n-1}{r-1} = \frac{(n-1)!}{r!\,(n-1-r)!} + \frac{(n-1)!}{(r-1)!\,(n-r)!} = \frac{(n-r)(n-1)! + r(n-1)!}{r!\,(n-r)!} = \frac{n!}{r!\,(n-r)!} = \binom{n}{r}$.
1.2-12 $0 = (1-1)^n = \displaystyle\sum_{r=0}^{n} \binom{n}{r}(-1)^r(1)^{n-r} = \sum_{r=0}^{n}(-1)^r\binom{n}{r}$.

$2^n = (1+1)^n = \displaystyle\sum_{r=0}^{n} \binom{n}{r}(1)^r(1)^{n-r} = \sum_{r=0}^{n}\binom{n}{r}$.
1.2-14 $\dbinom{5-1+29}{29} = \dfrac{33!}{29!\,4!} = 40{,}920$.
1.2-16 (a) $\dfrac{\binom{19}{3}\binom{52-19}{6}}{\binom{52}{9}} = \dfrac{102{,}486}{351{,}325} = 0.2917$;


(b) $\dfrac{\binom{19}{3}\binom{10}{2}\binom{7}{1}\binom{3}{0}\binom{5}{1}\binom{2}{0}\binom{6}{2}}{\binom{52}{9}} = \dfrac{7{,}695}{1{,}236{,}664} = 0.00622$.
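
Both probabilities are easy to verify with `math.comb` (a minimal Python sketch using only the counts shown above):

```python
from math import comb

denom = comb(52, 9)
a = comb(19, 3) * comb(33, 6) / denom
b = (comb(19, 3) * comb(10, 2) * comb(7, 1) * comb(3, 0)
     * comb(5, 1) * comb(2, 0) * comb(6, 2)) / denom

print(round(a, 4), round(b, 5))           # 0.2917 0.00622
```
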
1.2-18 (a) $P(A) = \sum_{n=1}^{5} (1/2)^n = 1 - (1/2)^5$;
(b) $P(B) = \sum_{n=1}^{10} (1/2)^n = 1 - (1/2)^{10}$;
(c) $P(A \cup B) = P(B) = 1 - (1/2)^{10}$;
(d) $P(A \cap B) = P(A) = 1 - (1/2)^5$;
(e) $P(C) = P(B) - P(A) = (1/2)^5 - (1/2)^{10}$;
(f) $P(B') = 1 - P(B) = (1/2)^{10}$.

1.3 Conditional Probability


1.3-2 (a) 1041/1456;

(b) 392/633;

(c) 649/823.

(d) The proportion of women who favor a gun law is greater than the proportion of men
who favor a gun law.
1.3-4 (a) P (HH) = (13/52) · (12/51) = 1/17;

(b) P (HC) = (13/52) · (13/51) = 13/204;

(c) P (Non-Ace Heart, Ace) + P (Ace of Hearts, Non-Heart Ace)
= (12/52) · (4/51) + (1/52) · (3/51) = 51/(52 · 51) = 1/52.
1.3-6 Let H = {died from heart disease}; P = {at least one parent had heart disease}. Then

P (H | P′) = N (H ∩ P′)/N (P′) = 110/648.

1.3-8 (a) (3/20) · (2/19) · (1/18) = 1/1140;

(b) $\dfrac{\binom{3}{2}\binom{17}{1}}{\binom{20}{3}} \cdot \dfrac{1}{17} = \dfrac{1}{380}$;

(c) $\displaystyle\sum_{k=1}^{9} \frac{\binom{3}{2}\binom{17}{2k-2}}{\binom{20}{2k}} \cdot \frac{1}{20-2k} = \frac{35}{76} = 0.4605$;


(d) Draw second. The probability of winning is 1 − 0.4605 = 0.5395.
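
The sum in part (c) can be checked exactly with `fractions` and `math.comb` (a minimal Python sketch; `p_part_c` is just an illustrative name):

```python
from fractions import Fraction
from math import comb

p_part_c = sum(
    Fraction(comb(3, 2) * comb(17, 2*k - 2), comb(20, 2*k)) * Fraction(1, 20 - 2*k)
    for k in range(1, 10)
)
print(p_part_c, float(p_part_c))   # 35/76, approximately 0.4605
```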

1.3-10 (a) P (A) = (52/52) · (51/52) · (50/52) · (49/52) · (48/52) · (47/52) = 8,808,975/11,881,376 = 0.74141;

(b) P (A′) = 1 − P (A) = 0.25859.
1.3-12 (a) It doesn’t matter because P (B1) = 1/18, P (B5) = 1/18, P (B18) = 1/18;

(b) P (B) = 2/18 = 1/9 on each draw.
1.3-14 (a) 5 · 4 · 3 = 60;
(b) 5 · 5 · 5 = 125.
1.3-16 (3/5) · (5/8) + (2/5) · (4/8) = 23/40.

1.4 Independent Events


1.4-2 (a) P (A ∩ B) = P (A)P (B) = (0.3)(0.6) = 0.18;
P (A ∪ B) = P (A) + P (B) − P (A ∩ B)
= 0.3 + 0.6 − 0.18
= 0.72;
(b) P (A | B) = P (A ∩ B)/P (B) = 0/0.6 = 0.
1.4-4 Proof of (b): P (A′ ∩ B) = P (B)P (A′ | B)
= P (B)[1 − P (A | B)]
= P (B)[1 − P (A)]
= P (B)P (A′).
Proof of (c): P (A′ ∩ B′) = P [(A ∪ B)′]
= 1 − P (A ∪ B)
= 1 − P (A) − P (B) + P (A ∩ B)
= 1 − P (A) − P (B) + P (A)P (B)
= [1 − P (A)][1 − P (B)]
= P (A′)P (B′).
1.4-6 P [A ∩ (B ∩ C)] = P [A ∩ B ∩ C]
= P (A)P (B)P (C)
= P (A)P (B ∩ C).
P [A ∩ (B ∪ C)] = P [(A ∩ B) ∪ (A ∩ C)]
= P (A ∩ B) + P (A ∩ C) − P (A ∩ B ∩ C)
= P (A)P (B) + P (A)P (C) − P (A)P (B)P (C)
= P (A)[P (B) + P (C) − P (B ∩ C)]
= P (A)P (B ∪ C).
P [A′ ∩ (B ∩ C′)] = P (A′ ∩ C′ ∩ B)
= P (B)P (A′ ∩ C′ | B)
= P (B)[1 − P (A ∪ C | B)]
= P (B)[1 − P (A ∪ C)]
= P (B)P [(A ∪ C)′]
= P (B)P (A′ ∩ C′)
= P (B)P (A′)P (C′)
= P (A′)P (B)P (C′)
= P (A′)P (B ∩ C′).


P [A′ ∩ B′ ∩ C′] = P [(A ∪ B ∪ C)′]
= 1 − P (A ∪ B ∪ C)
= 1 − P (A) − P (B) − P (C) + P (A)P (B) + P (A)P (C) + P (B)P (C) − P (A)P (B)P (C)
= [1 − P (A)][1 − P (B)][1 − P (C)]
= P (A′)P (B′)P (C′).

1.4-8 (1/6) · (2/6) · (3/6) + (1/6) · (4/6) · (3/6) + (5/6) · (2/6) · (3/6) = 2/9.
1.4-10 (a) (3/4) · (3/4) = 9/16;

(b) (1/4) · (3/4) + (3/4) · (2/4) = 9/16;

(c) (2/4) · (1/4) + (2/4) · (4/4) = 10/16.
1.4-12 (a) (1/2)^3 (1/2)^2;

(b) (1/2)^3 (1/2)^2;

(c) (1/2)^3 (1/2)^2;

(d) $\dfrac{5!}{3!\,2!}\,(1/2)^3 (1/2)^2$.

1.4-14 (a) 1 − (0.4)^3 = 1 − 0.064 = 0.936;


(b) 1 − (0.4)^8 = 1 − 0.00065536 = 0.99934464.

1.4-16 (a) $\displaystyle\sum_{k=0}^{\infty} \frac{1}{5}\left(\frac{4}{5}\right)^{2k} = \frac{5}{9}$;

(b) 1/5 + (4/5) · (3/4) · (1/3) + (4/5) · (3/4) · (2/3) · (1/2) · (1/1) = 3/5.
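
The closed-form value 5/9 in part (a) can be checked by summing enough terms of the series (a minimal Python sketch):

```python
# Partial sum of (1/5)(4/5)^(2k); the geometric ratio is (4/5)^2 = 16/25.
s = sum((1/5) * (4/5) ** (2 * k) for k in range(200))
print(s, 5/9)   # both print 0.5555555555...
```
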
1.4-18 (a) 7; (b) (1/2)^7; (c) 63; (d) No! (1/2)^63 = 1/9,223,372,036,854,775,808.

1.4-20 No. The equations that must hold are

(1 − p1)(1 − p2) = p1(1 − p2) + p2(1 − p1) = p1 p2.

Equating the first and third expressions gives p1 + p2 = 1; substituting p2 = 1 − p1 into the
equality of the second and third expressions then gives 3p1^2 − 3p1 + 1 = 0, whose discriminant
is negative. There are no real solutions.

1.5 Bayes’ Theorem


1.5-2 (a) P (G) = P (A ∩ G) + P (B ∩ G)
= P (A)P (G | A) + P (B)P (G | B)
= (0.40)(0.85) + (0.60)(0.75) = 0.79;
(b) P (A | G) = P (A ∩ G)/P (G) = (0.40)(0.85)/0.79 = 0.43.


1.5-4 Let event B denote an accident and let A1 be the event that the age of the driver is 16–25.
Then
$$P(A_1 \mid B) = \frac{(0.1)(0.05)}{(0.1)(0.05) + (0.55)(0.02) + (0.20)(0.03) + (0.15)(0.04)} = \frac{50}{50 + 110 + 60 + 60} = \frac{50}{280} = 0.179.$$
1.5-6 Let B be the event that the policyholder dies. Let A1 , A2 , A3 be the events that the
deceased is standard, preferred and ultra-preferred, respectively. Then
$$P(A_1 \mid B) = \frac{(0.60)(0.01)}{(0.60)(0.01) + (0.30)(0.008) + (0.10)(0.007)} = \frac{60}{60 + 24 + 7} = \frac{60}{91} = 0.659;$$
$$P(A_2 \mid B) = \frac{24}{91} = 0.264; \qquad P(A_3 \mid B) = \frac{7}{91} = 0.077.$$
1.5-8 Let A be the event that the tablet is under warranty.
$$P(B_1 \mid A) = \frac{(0.40)(0.10)}{(0.40)(0.10) + (0.30)(0.05) + (0.20)(0.03) + (0.10)(0.02)} = \frac{40}{40 + 15 + 6 + 2} = \frac{40}{63} = 0.635;$$
$$P(B_2 \mid A) = \frac{15}{63} = 0.238; \qquad P(B_3 \mid A) = \frac{6}{63} = 0.095; \qquad P(B_4 \mid A) = \frac{2}{63} = 0.032.$$
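
The same arithmetic recurs in each of these Bayes' theorem exercises, so a tiny helper is handy for checking them. This is a Python sketch (the function name `posterior` is illustrative, not from the text), applied here to Exercise 1.5-8:

```python
def posterior(priors, likelihoods):
    """Return P(B_i | A) for each i, given the priors P(B_i) and the likelihoods P(A | B_i)."""
    joints = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joints)                       # P(A), by the law of total probability
    return [round(j / total, 3) for j in joints]

print(posterior([0.40, 0.30, 0.20, 0.10], [0.10, 0.05, 0.03, 0.02]))
# [0.635, 0.238, 0.095, 0.032]
```
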
1.5-10 (a) P (D+) = (0.02)(0.92) + (0.98)(0.05) = 0.0184 + 0.0490 = 0.0674;

(b) P (A− | D+) = 0.0490/0.0674 = 0.727; P (A+ | D+) = 0.0184/0.0674 = 0.273;

(c) P (A− | D−) = (0.98)(0.95)/[(0.02)(0.08) + (0.98)(0.95)] = 9310/(16 + 9310) = 0.998;
P (A+ | D−) = 0.002;
(d) Yes, particularly those in part (b).

1.5-12 Let D = {defective roll}. Then


$$P(\mathrm{I} \mid D) = \frac{P(\mathrm{I} \cap D)}{P(D)} = \frac{P(\mathrm{I})\,P(D \mid \mathrm{I})}{P(\mathrm{I})\,P(D \mid \mathrm{I}) + P(\mathrm{II})\,P(D \mid \mathrm{II})}$$
$$= \frac{(0.60)(0.03)}{(0.60)(0.03) + (0.40)(0.01)} = \frac{0.018}{0.018 + 0.004} = \frac{0.018}{0.022} = 0.818.$$


Chapter 2

Discrete Distributions

2.1 Random Variables of the Discrete Type


2.1-2 (a) $f(x) = \begin{cases} 0.6, & x = 1,\\ 0.3, & x = 5,\\ 0.1, & x = 10; \end{cases}$

(b) [Figure 2.1–2: A probability histogram of f(x), with bars of heights 0.6, 0.3, and 0.1 at x = 1, 5, and 10]

2.1-4 (a) $\displaystyle\sum_{x=1}^{9} \log_{10}\!\left(\frac{x+1}{x}\right) = \sum_{x=1}^{9}\,[\log_{10}(x+1) - \log_{10} x]$
$= \log_{10} 2 - \log_{10} 1 + \log_{10} 3 - \log_{10} 2 + \cdots + \log_{10} 10 - \log_{10} 9 = \log_{10} 10 = 1$;

(b) $F(x) = \begin{cases} 0, & x < 1,\\ \log_{10} n, & n-1 \le x < n,\ n = 2, 3, \ldots, 9,\\ 1, & 9 \le x. \end{cases}$


2.1-6 (a) f (x) = 1/10, x = 0, 1, 2, · · · , 9;

(b) N ({0})/150 = 11/150 = 0.073; N ({5})/150 = 13/150 = 0.087;


N ({1})/150 = 14/150 = 0.093; N ({6})/150 = 22/150 = 0.147;
N ({2})/150 = 13/150 = 0.087; N ({7})/150 = 16/150 = 0.107;
N ({3})/150 = 12/150 = 0.080; N ({8})/150 = 18/150 = 0.120;
N ({4})/150 = 16/150 = 0.107; N ({9})/150 = 15/150 = 0.100.

(c) [Figure 2.1–6: Michigan daily lottery digits — probability histogram comparing f(x) with the relative frequencies h(x)]

2.1-8 (a) f (x) = (6 − |7 − x|)/36, x = 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12;
(b) [Figure 2.1–8: Probability histogram for the sum of a pair of dice]
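
The formula in part (a) can be checked against a direct enumeration of the 36 equally likely outcomes (a minimal Python sketch):

```python
from collections import Counter

counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
for x in range(2, 13):
    assert counts[x] == 6 - abs(7 - x)    # numerator of f(x); the denominator is 36
print("f(x) = (6 - |7 - x|)/36 agrees with the enumeration")
```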


2.1-10 (a) The space of W is S = {0, 1, 2, 3, 4, 5, 6, 7}.


P (W = 0) = P (X = 0, Y = 0) = (1/2) · (1/4) = 1/8, assuming independence;
P (W = 1) = P (X = 0, Y = 1) = (1/2) · (1/4) = 1/8;
P (W = 2) = P (X = 2, Y = 0) = (1/2) · (1/4) = 1/8;
P (W = 3) = P (X = 2, Y = 1) = (1/2) · (1/4) = 1/8;
P (W = 4) = P (X = 0, Y = 4) = (1/2) · (1/4) = 1/8;
P (W = 5) = P (X = 0, Y = 5) = (1/2) · (1/4) = 1/8;
P (W = 6) = P (X = 2, Y = 4) = (1/2) · (1/4) = 1/8;
P (W = 7) = P (X = 2, Y = 5) = (1/2) · (1/4) = 1/8.
That is, f (w) = P (W = w) = 1/8, w ∈ S.

(b) [Figure 2.1–10: Probability histogram of the sum of two special dice]

2.1-12 Let x equal the number of orange balls and 144 − x the number of blue balls. Then
$$\frac{x}{144}\cdot\frac{x-1}{143} + \frac{144-x}{144}\cdot\frac{143-x}{143} = \frac{x}{144}\cdot\frac{144-x}{143} + \frac{144-x}{144}\cdot\frac{x}{143}$$
$$x^2 - x + 144 \cdot 143 - 144x - 143x + x^2 = 2 \cdot 144x - 2x^2$$
$$x^2 - 144x + 5{,}148 = 0$$
$$(x - 78)(x - 66) = 0.$$
Thus there are 78 orange balls and 66 blue balls.
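
The roots of the quadratic can be checked directly against the probability balance displayed above, using exact fractions; a minimal Python sketch:

```python
from fractions import Fraction as F

def p_same_color(x):        # P(two balls drawn without replacement have the same color)
    return F(x, 144) * F(x - 1, 143) + F(144 - x, 144) * F(143 - x, 143)

def p_different_color(x):   # P(the two balls have different colors)
    return F(x, 144) * F(144 - x, 143) + F(144 - x, 144) * F(x, 143)

print([x for x in range(145) if p_same_color(x) == p_different_color(x)])   # [66, 78]
```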

2.2 Mathematical Expectation


2.2-2 E(X) = (−1)(4/9) + (0)(1/9) + (1)(4/9) = 0;
E(X^2) = (−1)^2(4/9) + (0)^2(1/9) + (1)^2(4/9) = 8/9;
