Instructor’s Manual to Accompany

Introduction to
Probability Models
Ninth Edition

Sheldon M. Ross
University of California
Berkeley, California

AMSTERDAM • BOSTON • HEIDELBERG • LONDON


NEW YORK • OXFORD • PARIS • SAN DIEGO
SAN FRANCISCO • SINGAPORE • SYDNEY • TOKYO
Academic Press is an imprint of Elsevier
30 Corporate Drive, Suite 400, Burlington, MA 01803, USA
525 B Street, Suite 1900, San Diego, California 92101-4495, USA
84 Theobald's Road, London WC1X 8RR, UK

Copyright © 2007, Elsevier Inc. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or
mechanical, including photocopy, recording, or any information storage and retrieval system, without
permission in writing from the publisher.

Permissions may be sought directly from Elsevier’s Science & Technology Rights Department in Oxford,
UK: phone: (+44) 1865 843830, fax: (+44) 1865 853333, E-mail: [email protected]. You may also
complete your request on-line via the Elsevier homepage (http://elsevier.com), by selecting “Support & Contact”
then “Copyright and Permission” and then “Obtaining Permissions.”

ISBN 13: 978-0-12-373875-2


ISBN 10: 0-12-373875-X

For information on all Academic Press publications


visit our Web site at www.books.elsevier.com

Printed in the United States of America


06 07 08 09 10 9 8 7 6 5 4 3 2 1
Contents

Chapter 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Chapter 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Chapter 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Chapter 4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Chapter 5 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Chapter 6 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Chapter 7 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Chapter 8 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Chapter 9 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
Chapter 10 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Chapter 11 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Chapter 1

1. S = {(R, R), (R, G), (R, B), (G, R), (G, G), (G, B), (B, R), (B, G), (B, B)}. The probability of each point in S is 1/9.

2. S = {(R, G), (R, B), (G, R), (G, B), (B, R), (B, G)}.

3. S = {(e_1, e_2, …, e_n), n ≥ 2}, where e_i ∈ {heads, tails}. In addition, e_n = e_{n−1} = heads, and for i = 1, …, n − 2, if e_i = heads, then e_{i+1} = tails.
   P{4 tosses} = P{(t, t, h, h)} + P{(h, t, h, h)} = 2(1/2)^4 = 1/8.

4. (a) F(E ∪ G)^c = F E^c G^c.
   (b) E F G^c.
   (c) E ∪ F ∪ G.
   (d) EF ∪ EG ∪ FG.
   (e) EFG.
   (f) (E ∪ F ∪ G)^c = E^c F^c G^c.
   (g) (EF)^c (EG)^c (FG)^c.
   (h) (EFG)^c.

5. 3/4. If he wins, he only wins $1, while if he loses, he loses $3.

6. If E(F ∪ G) occurs, then E occurs and either F or G occurs; therefore, either EF or EG occurs, and so E(F ∪ G) ⊂ EF ∪ EG. Similarly, if EF ∪ EG occurs, then either EF or EG occurs. Thus, E occurs and either F or G occurs, and so E(F ∪ G) occurs. Hence, EF ∪ EG ⊂ E(F ∪ G), which together with the reverse inclusion proves the result.

7. If (E ∪ F)^c occurs, then E ∪ F does not occur, and so E does not occur (and so E^c does); F does not occur (and so F^c does); thus E^c and F^c both occur. Hence, (E ∪ F)^c ⊂ E^c F^c. If E^c F^c occurs, then E^c occurs (and so E does not), and F^c occurs (and so F does not). Hence, neither E nor F occurs and thus (E ∪ F)^c does. Thus, E^c F^c ⊂ (E ∪ F)^c, and the result follows.

8. 1 ≥ P(E ∪ F) = P(E) + P(F) − P(EF).

9. F = E ∪ FE^c, implying, since E and FE^c are disjoint, that P(F) = P(E) + P(FE^c).

10. Either by induction or use
    ∪_{i=1}^{n} E_i = E_1 ∪ E_1^c E_2 ∪ E_1^c E_2^c E_3 ∪ ⋯ ∪ E_1^c ⋯ E_{n−1}^c E_n,
    and, as the terms on the right side are mutually exclusive,
    P(∪_i E_i) = P(E_1) + P(E_1^c E_2) + P(E_1^c E_2^c E_3) + ⋯ + P(E_1^c ⋯ E_{n−1}^c E_n)
               ≤ P(E_1) + P(E_2) + ⋯ + P(E_n). (why?)

11. P{sum is i} = (i − 1)/36 for i = 2, …, 7, and (13 − i)/36 for i = 8, …, 12.

12. Either use the hint or condition on the initial outcome:
    P{E before F} = P{E before F | initial outcome is E}P(E)
                  + P{E before F | initial outcome is F}P(F)
                  + P{E before F | initial outcome neither E nor F}[1 − P(E) − P(F)]
                  = 1 · P(E) + 0 · P(F) + P{E before F}[1 − P(E) − P(F)].
    Therefore, P{E before F} = P(E)/[P(E) + P(F)].
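The identity in Problem 12 is easy to check numerically. The following is a small Monte Carlo sketch (not part of the original manual); the values chosen for P(E) and P(F) are arbitrary illustrative numbers.

```python
import random

def prob_E_before_F(pE, pF, trials=200_000):
    """Estimate P{E before F} by repeating the experiment until E or F occurs."""
    wins = 0
    for _ in range(trials):
        while True:
            u = random.random()
            if u < pE:            # outcome E occurs first
                wins += 1
                break
            if u < pE + pF:       # outcome F occurs first
                break
            # neither E nor F: repeat the trial
    return wins / trials

pE, pF = 0.2, 0.3
print(prob_E_before_F(pE, pF))    # simulation estimate
print(pE / (pE + pF))             # Problem 12: P(E)/[P(E) + P(F)] = 0.4
```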
13. Condition on the initial toss:
    P{win} = Σ_{i=2}^{12} P{win | throw i} P{throw i}.
    Now,
    P{win | throw i} = P{i before 7} =
      0 for i = 2, 3, 12;
      (i − 1)/(i + 5) for i = 4, 5, 6;
      1 for i = 7, 11;
      (13 − i)/(19 − i) for i = 8, 9, 10,
    where the above is obtained by using Problems 11 and 12. This gives P{win} ≈ .49.

14. P{A wins} = Σ_{n=0}^{∞} P{A wins on (2n + 1)st toss}
             = Σ_{n=0}^{∞} (1 − P)^{2n} P
             = P Σ_{n=0}^{∞} [(1 − P)^2]^n
             = P/[1 − (1 − P)^2]
             = P/(2P − P^2)
             = 1/(2 − P).
    P{B wins} = 1 − P{A wins} = (1 − P)/(2 − P).

16. P(E ∪ F) = P(E ∪ FE^c) = P(E) + P(FE^c), since E and FE^c are disjoint. Also,
    P(F) = P(FE ∪ FE^c) = P(FE) + P(FE^c) by disjointness. Hence,
    P(E ∪ F) = P(E) + P(F) − P(EF).

17. Prob{end} = 1 − Prob{continue} = 1 − P({H, H, H} ∪ {T, T, T}) = 1 − [Prob(H, H, H) + Prob(T, T, T)].
    Fair coin: Prob{end} = 1 − [(1/2)(1/2)(1/2) + (1/2)(1/2)(1/2)] = 3/4.
    Biased coin: P{end} = 1 − [(1/4)(1/4)(1/4) + (3/4)(3/4)(3/4)] = 9/16.

18. Let B = event both are girls; E = event the oldest is a girl; L = event at least one is a girl.
    (a) P(B | E) = P(BE)/P(E) = P(B)/P(E) = (1/4)/(1/2) = 1/2.
    (b) P(L) = 1 − P(no girls) = 1 − 1/4 = 3/4,
        P(B | L) = P(BL)/P(L) = P(B)/P(L) = (1/4)/(3/4) = 1/3.

19. Let E = event at least one six. P(E) = (number of ways to get E)/(number of sample points) = 11/36.
    Let D = event the two faces are different. P(D) = 1 − Prob(two faces the same) = 1 − 6/36 = 5/6.
    P(E | D) = P(ED)/P(D) = (10/36)/(5/6) = 1/3.

20. Let E = event the same number appears on exactly two of the dice; S = event all 3 numbers are the same; D = event all 3 numbers are different. These 3 events are mutually exclusive and define the whole sample space. Thus, 1 = P(D) + P(S) + P(E). Now P(S) = 6/216 = 1/36; for D there are 6 possible values for the first die, 5 for the second, and 4 for the third, so the number of ways to get D is 6 · 5 · 4 = 120 and P(D) = 120/216 = 20/36. Therefore,
    P(E) = 1 − P(D) − P(S) = 1 − 20/36 − 1/36 = 5/12.

21. Let C = event the person is color blind.
    P(Male | C) = P(C | Male)P(Male)/[P(C | Male)P(Male) + P(C | Female)P(Female)]
                = (.05 × .5)/(.05 × .5 + .0025 × .5)
                = 2500/2625 = 20/21.
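Problem 13's numerical answer can be reproduced exactly from the case analysis above. The snippet below is an illustrative check (not from the manual) that evaluates the sum with exact fractions.

```python
from fractions import Fraction as F

# P{throw i} for the sum of two dice (Problem 11)
p_throw = {i: F(i - 1, 36) for i in range(2, 8)}
p_throw.update({i: F(13 - i, 36) for i in range(8, 13)})

def p_win_given(i):
    """P{i before 7}, the conditional win probability used in Problem 13."""
    if i in (2, 3, 12):
        return F(0)
    if i in (7, 11):
        return F(1)
    if 4 <= i <= 6:
        return F(i - 1, i + 5)
    return F(13 - i, 19 - i)          # i = 8, 9, 10

p_win = sum(p_win_given(i) * p_throw[i] for i in range(2, 13))
print(p_win, float(p_win))            # 244/495 ≈ 0.4929
```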


22. Let trial 1 consist of the first two points, trial 2 the next two points, and so on. The probability that each player wins one point in a trial is 2p(1 − p). Now a total of 2n points are played if the first (n − 1) trials all result in each player winning one of the points in that trial and the nth trial results in one of the players winning both points. By independence, we obtain
    P{2n points are needed} = (2p(1 − p))^{n−1}(p^2 + (1 − p)^2), n ≥ 1.
    The probability that A wins on trial n is (2p(1 − p))^{n−1} p^2, and so
    P{A wins} = p^2 Σ_{n=1}^{∞} (2p(1 − p))^{n−1} = p^2/[1 − 2p(1 − p)].

23. P(E_1)P(E_2 | E_1)P(E_3 | E_1E_2) ⋯ P(E_n | E_1 ⋯ E_{n−1})
    = P(E_1) · [P(E_1E_2)/P(E_1)] · [P(E_1E_2E_3)/P(E_1E_2)] ⋯ [P(E_1 ⋯ E_n)/P(E_1 ⋯ E_{n−1})]
    = P(E_1 ⋯ E_n).

24. Let a signify a vote for A and b one for B.
    (a) P_{2,1} = P{a, a, b} = 1/3.
    (b) P_{3,1} = P{a, a} = (3/4)(2/3) = 1/2.
    (c) P_{3,2} = P{a, a, a} + P{a, a, b, a} = (3/5)(2/4)[1/3 + (2/3)(1/2)] = 1/5.
    (d) P_{4,1} = P{a, a} = (4/5)(3/4) = 3/5.
    (e) P_{4,2} = P{a, a, a} + P{a, a, b, a} = (4/6)(3/5)[2/4 + (2/4)(2/3)] = 1/3.
    (f) P_{4,3} = P{always ahead | a, a}(4/7)(3/6)
               = (2/7)[1 − P{a, a, a, b, b, b | a, a} − P{a, a, b, b | a, a} − P{a, a, b, a, b, b | a, a}]
               = (2/7)[1 − (2/5)(3/4)(2/3)(1/2) − (3/5)(2/4) − (3/5)(2/4)(2/3)(1/2)]
               = 1/7.
    (g) P_{5,1} = P{a, a} = (5/6)(4/5) = 2/3.
    (h) P_{5,2} = P{a, a, a} + P{a, a, b, a} = (5/7)(4/6)[(3/5) + (2/5)(3/4)] = 3/7.
    By the same reasoning we have
    (i) P_{5,3} = 1/4,
    (j) P_{5,4} = 1/9.
    (k) In all the cases above, P_{n,m} = (n − m)/(n + m).

25. (a) P{pair} = P{second card is same denomination as first} = 3/51.
    (b) P{pair | different suits} = P{pair, different suits}/P{different suits}
        = P{pair}/P{different suits} = (3/51)/(39/51) = 1/13.

26. P(E_1) = C(4,1)C(48,12)/C(52,13) = (39 · 38 · 37)/(51 · 50 · 49).
    P(E_2 | E_1) = C(3,1)C(36,12)/C(39,13) = (26 · 25)/(38 · 37).
    P(E_3 | E_1E_2) = C(2,1)C(24,12)/C(26,13) = 13/25.
    P(E_4 | E_1E_2E_3) = 1.
    P(E_1E_2E_3E_4) = (39 · 26 · 13)/(51 · 50 · 49).

27. P(E_1) = 1.
    P(E_2 | E_1) = 39/51, since 12 cards are in the ace of spades pile and 39 are not.
    P(E_3 | E_1E_2) = 26/50, since 24 cards are in the piles of the two aces and 26 are in the other two piles.
    P(E_4 | E_1E_2E_3) = 13/49.
    So P{each pile has an ace} = (39/51)(26/50)(13/49).
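The closed form P_{n,m} = (n − m)/(n + m) in part (k) can be spot-checked by simulating random orderings of the ballots; the sketch below is illustrative only, with n and m chosen arbitrarily.

```python
import random

def p_always_ahead(n, m, trials=100_000):
    """Estimate the probability that A stays strictly ahead throughout the count."""
    votes = ['a'] * n + ['b'] * m
    hits = 0
    for _ in range(trials):
        random.shuffle(votes)
        lead, ahead = 0, True
        for v in votes:
            lead += 1 if v == 'a' else -1
            if lead <= 0:
                ahead = False
                break
        hits += ahead
    return hits / trials

n, m = 5, 2
print(p_always_ahead(n, m), (n - m) / (n + m))   # both ≈ 3/7
```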
28. Yes. P(A | B) > P(A) is equivalent to P(AB) > P(A)P(B), which is equivalent to P(B | A) > P(B).

29. (a) P(E | F) = 0.
    (b) P(E | F) = P(EF)/P(F) = P(E)/P(F) ≥ P(E) = .6.
    (c) P(E | F) = P(EF)/P(F) = P(F)/P(F) = 1.

30. (a) P{George | exactly 1 hit} = P{George, not Bill}/P{exactly 1}
        = P{G, not B}/[P{G, not B} + P{B, not G}]
        = (.4)(.3)/[(.4)(.3) + (.7)(.6)] = 2/9.
    (b) P{G | hit} = P{G, hit}/P{hit} = P{G}/P{hit} = .4/[1 − (.3)(.6)] = 20/41.

31. Let S = event the sum of the dice is 7; F = event the first die is 6.
    P(S) = 1/6, P(FS) = 1/36, and P(F | S) = P(FS)/P(S) = (1/36)/(1/6) = 1/6.

32. Let E_i = event person i selects his own hat.
    P(no one selects own hat)
    = 1 − P(E_1 ∪ E_2 ∪ ⋯ ∪ E_n)
    = 1 − [Σ_{i_1} P(E_{i_1}) − Σ_{i_1<i_2} P(E_{i_1}E_{i_2}) + ⋯ + (−1)^{n+1} P(E_1E_2 ⋯ E_n)]
    = 1 − Σ_{i_1} P(E_{i_1}) + Σ_{i_1<i_2} P(E_{i_1}E_{i_2}) − Σ_{i_1<i_2<i_3} P(E_{i_1}E_{i_2}E_{i_3}) + ⋯ + (−1)^n P(E_1E_2 ⋯ E_n).
    Let k ∈ {1, 2, …, n}. Then P(E_{i_1}E_{i_2} ⋯ E_{i_k}) = (number of ways k specific men can select their own hats) ÷ (total number of ways the hats can be arranged) = (n − k)!/n!. The number of terms in the summation Σ_{i_1<i_2<⋯<i_k} is the number of ways to choose k indices out of n, namely C(n, k) = n!/[k!(n − k)!]. Thus,
    Σ_{i_1<⋯<i_k} P(E_{i_1}E_{i_2} ⋯ E_{i_k}) = C(n, k)(n − k)!/n! = 1/k!.
    ∴ P(no one selects own hat) = 1 − 1/1! + 1/2! − 1/3! + ⋯ + (−1)^n 1/n!
                                = 1/2! − 1/3! + ⋯ + (−1)^n 1/n!.

33. Let S = event the student is a sophomore; F = event the student is a freshman; B = event the student is a boy; G = event the student is a girl. Let x = number of sophomore girls, so the total number of students is 16 + x.
    P(F) = 10/(16 + x), P(B) = 10/(16 + x), P(FB) = 4/(16 + x).
    Independence requires P(FB) = P(F)P(B), that is, 4/(16 + x) = [10/(16 + x)][10/(16 + x)] ⇒ x = 9.

34. Not a good system. The successive spins are independent, and so
    P{11th is red | 1st 10 black} = P{11th is red} = 18/38.

35. (a) 1/16.
    (b) 1/16.
    (c) 15/16, since the only way in which the pattern H, H, H, H can appear before the pattern T, H, H, H is if the first 4 flips all land heads.

36. Let B = event the marble is black; B_i = event that box i is chosen. Now
    B = BB_1 ∪ BB_2, so P(B) = P(BB_1) + P(BB_2) = P(B | B_1)P(B_1) + P(B | B_2)P(B_2)
    = (1/2)(1/2) + (2/3)(1/2) = 7/12.
37. Let W = event the marble is white.
    P(B_1 | W) = P(W | B_1)P(B_1)/[P(W | B_1)P(B_1) + P(W | B_2)P(B_2)]
               = [(1/2)(1/2)]/[(1/2)(1/2) + (1/3)(1/2)]
               = (1/4)/(5/12) = 3/5.

38. Let T_W = event the transferred ball is white; T_B = event the transferred ball is black; W = event a white ball is drawn from urn 2.
    P(T_W | W) = P(W | T_W)P(T_W)/[P(W | T_W)P(T_W) + P(W | T_B)P(T_B)]
               = [(2/7)(2/3)]/[(2/7)(2/3) + (1/7)(1/3)]
               = (4/21)/(5/21) = 4/5.

39. Let W = event a woman resigns; A, B, C are the events that the person resigning works in store A, B, C, respectively.
    P(C | W) = P(W | C)P(C)/[P(W | C)P(C) + P(W | B)P(B) + P(W | A)P(A)]
             = [.70 × 100/225]/[.70 × 100/225 + .60 × 75/225 + .50 × 50/225]
             = (70/225)/(140/225) = 1/2.

40. (a) Let F = event the fair coin was flipped; U = event the two-headed coin was flipped.
        P(F | H) = P(H | F)P(F)/[P(H | F)P(F) + P(H | U)P(U)]
                 = [(1/2)(1/2)]/[(1/2)(1/2) + 1 · (1/2)] = (1/4)/(3/4) = 1/3.
    (b) P(F | HH) = P(HH | F)P(F)/[P(HH | F)P(F) + P(HH | U)P(U)]
                  = [(1/4)(1/2)]/[(1/4)(1/2) + 1 · (1/2)] = (1/8)/(5/8) = 1/5.
    (c) P(F | HHT) = P(HHT | F)P(F)/[P(HHT | F)P(F) + P(HHT | U)P(U)]
                   = P(HHT | F)P(F)/[P(HHT | F)P(F) + 0] = 1,
        since the fair coin is the only one that can show tails.

41. Note first that since the rat has black parents and a brown sibling, we know that both of its parents are hybrids with one black and one brown gene (for if either were pure black then all their offspring would be black). Hence, each of their offspring's genes is equally likely to be either black or brown.
    (a) P(2 black genes | at least one black gene) = P(2 black genes)/P(at least one black gene) = (1/4)/(3/4) = 1/3.
    (b) Using the result from part (a) yields
        P(2 black genes | 5 black offspring) = P(2 black genes)/P(5 black offspring)
                                             = (1/3)/[1 · (1/3) + (1/2)^5(2/3)] = 16/17,
        where P(5 black offspring) was computed by conditioning on whether the rat had 2 black genes.

42. Let B = event the biased coin was flipped; F and U as above.
    P(U | H) = P(H | U)P(U)/[P(H | U)P(U) + P(H | B)P(B) + P(H | F)P(F)]
             = [1 · (1/3)]/[1 · (1/3) + (3/4)(1/3) + (1/2)(1/3)]
             = (1/3)/(3/4) = 4/9.

43. Let i = event that coin i was selected, so P(H | i) = i/10.
    P(5 | H) = P(H | 5)P(5)/Σ_{i=1}^{10} P(H | i)P(i)
             = [(5/10)(1/10)]/[Σ_{i=1}^{10} (i/10)(1/10)]
             = 5/Σ_{i=1}^{10} i = 5/55 = 1/11.
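Problem 43 is a one-line Bayes computation, so it is convenient to verify with exact arithmetic. This is a minimal sketch (not part of the manual) that simply encodes the uniform prior over the ten coins.

```python
from fractions import Fraction as F

prior = {i: F(1, 10) for i in range(1, 11)}        # each coin equally likely
likelihood = {i: F(i, 10) for i in range(1, 11)}   # coin i lands heads w.p. i/10

evidence = sum(prior[i] * likelihood[i] for i in prior)
print(prior[5] * likelihood[5] / evidence)         # 1/11, as in Problem 43
```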
44. Let W = event a white ball is selected.
    P(T | W) = P(W | T)P(T)/[P(W | T)P(T) + P(W | H)P(H)]
             = [(1/5)(1/2)]/[(1/5)(1/2) + (5/12)(1/2)] = 12/37.

45. Let B_i = event the ith ball is black; R_i = event the ith ball is red.
    P(B_1 | R_2) = P(R_2 | B_1)P(B_1)/[P(R_2 | B_1)P(B_1) + P(R_2 | R_1)P(R_1)]
               = [r/(b + r + c) · b/(b + r)] / [r/(b + r + c) · b/(b + r) + (r + c)/(b + r + c) · r/(b + r)]
               = rb/[rb + (r + c)r]
               = b/(b + r + c).

46. Let X (= B or C) denote the jailer's answer to prisoner A. Now, for instance,
    P{A to be executed | X = B} = P{A to be executed, X = B}/P{X = B}
                                = P{A to be executed}P{X = B | A to be executed}/P{X = B}
                                = (1/3)P{X = B | A to be executed}/(1/2).
    Now it is reasonable to suppose that if A is to be executed, then the jailer is equally likely to answer either B or C. That is,
    P{X = B | A to be executed} = 1/2,
    and so
    P{A to be executed | X = B} = 1/3.
    Similarly, P{A to be executed | X = C} = 1/3, and thus the jailer's reasoning is invalid. (It is true that if the jailer were to answer B, then A knows that the condemned is either himself or C, but it is twice as likely to be C.)

47. 1. 0 ≤ P(A | B) ≤ 1.
    2. P(S | B) = P(SB)/P(B) = P(B)/P(B) = 1.
    3. For disjoint events A and D,
       P(A ∪ D | B) = P((A ∪ D)B)/P(B) = P(AB ∪ DB)/P(B) = [P(AB) + P(DB)]/P(B) = P(A | B) + P(D | B).
    Direct verification is as follows:
    P(A | BC)P(C | B) + P(A | BC^c)P(C^c | B)
    = [P(ABC)/P(BC)][P(BC)/P(B)] + [P(ABC^c)/P(BC^c)][P(BC^c)/P(B)]
    = P(ABC)/P(B) + P(ABC^c)/P(B)
    = P(AB)/P(B)
    = P(A | B).
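Before moving to Chapter 2, the inclusion–exclusion answer of Problem 32 (the matching problem) is easy to confirm numerically. The code below is an illustrative check only, assuming n = 8 hats; it compares the alternating sum with a straightforward simulation and with the limiting value 1/e.

```python
import math
import random

def exact_no_match(n):
    """Inclusion-exclusion sum from Problem 32: 1/2! - 1/3! + ... + (-1)^n/n!."""
    return sum((-1) ** k / math.factorial(k) for k in range(2, n + 1))

def simulated_no_match(n, trials=100_000):
    hats = list(range(n))
    count = 0
    for _ in range(trials):
        random.shuffle(hats)
        if all(hats[i] != i for i in range(n)):
            count += 1
    return count / trials

n = 8
print(exact_no_match(n), simulated_no_match(n), math.exp(-1))   # all ≈ 0.368
```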
Chapter 2

1. P{X = 0} = C(7, 2)/C(10, 2) = 7/15.

2. The possible values are −n, −n + 2, −n + 4, …, n − 2, n.

3. P{X = −2} = 1/4 = P{X = 2}; P{X = 0} = 1/2.

4. (i) 1, 2, 3, 4, 5, 6. (ii) 1, 2, 3, 4, 5, 6. (iii) 2, 3, …, 11, 12. (iv) −5, −4, …, 4, 5.

5. P{max = 6} = 11/36 = P{min = 1};
   P{max = 5} = 1/4 = P{min = 2};
   P{max = 4} = 7/36 = P{min = 3};
   P{max = 3} = 5/36 = P{min = 4};
   P{max = 2} = 1/12 = P{min = 5};
   P{max = 1} = 1/36 = P{min = 6}.

6. (H, H, H, H, H), with probability p^5 if p = P{heads}.

7. p(0) = (.3)^3 = .027, p(1) = 3(.3)^2(.7) = .189, p(2) = 3(.3)(.7)^2 = .441, p(3) = (.7)^3 = .343.

8. p(0) = 1/2, p(1) = 1/2.

9. p(0) = 1/2, p(1) = 1/10, p(2) = 1/5, p(3) = 1/10, p(3.5) = 1/10.

10. 1 − C(3, 2)(1/6)^2(5/6) − (1/6)^3 = 200/216.

11. 3/8.

12. C(5, 4)(1/3)^4(2/3) + (1/3)^5 = (10 + 1)/243 = 11/243.

13. Σ_{i=7}^{10} C(10, i)(1/2)^{10}.

14. P{X = 0} = P{X = 6} = (1/2)^6 = 1/64,
    P{X = 1} = P{X = 5} = 6(1/2)^6 = 6/64,
    P{X = 2} = P{X = 4} = C(6, 2)(1/2)^6 = 15/64,
    P{X = 3} = C(6, 3)(1/2)^6 = 20/64.

15. P{X = k}/P{X = k − 1}
    = [n!/((n − k)! k!)] p^k (1 − p)^{n−k} / {[n!/((n − k + 1)!(k − 1)!)] p^{k−1}(1 − p)^{n−k+1}}
    = [(n − k + 1)/k] · p/(1 − p).
    Hence,
    P{X = k}/P{X = k − 1} ≥ 1 ⟺ (n − k + 1)p ≥ k(1 − p) ⟺ (n + 1)p ≥ k.
    The result follows.

16. 1 − (.95)^52 − 52(.95)^51(.05).

17. Follows since there are n!/(x_1! ⋯ x_r!) permutations of n objects of which x_1 are alike, x_2 are alike, …, x_r are alike.
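Problem 15 implies that the binomial pmf increases up to k = floor((n + 1)p) and decreases afterward, so that value is the mode. A quick check with arbitrary illustrative parameters (not from the manual):

```python
from math import comb

def binomial_pmf(n, p, k):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

n, p = 20, 0.3
mode = max(range(n + 1), key=lambda k: binomial_pmf(n, p, k))
print(mode, int((n + 1) * p))   # both are 6
```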
18. Follows immediately.

19. P{X_1 + ⋯ + X_k = m} = C(n, m)(p_1 + ⋯ + p_k)^m (p_{k+1} + ⋯ + p_r)^{n−m}.

20. [5!/(2! 1! 2!)](1/5)^2(3/10)^2(1/2) = .054.

21. 1 − (3/10)^5 − 5(3/10)^4(7/10) − C(5, 2)(3/10)^3(7/10)^2.

22. 1/32.

23. In order for X to equal n, the first n − 1 flips must have r − 1 heads, and then the nth flip must land heads. By independence the desired probability is thus
    C(n − 1, r − 1) p^{r−1}(1 − p)^{n−r} × p.

24. It is the number of tails before heads appears for the rth time.

25. A total of 7 games will be played if the first 6 result in 3 wins and 3 losses. Thus,
    P{7 games} = C(6, 3) p^3(1 − p)^3.
    Differentiation yields
    (d/dp) P{7} = 20[3p^2(1 − p)^3 − p^3 · 3(1 − p)^2] = 60p^2(1 − p)^2[1 − 2p].
    Thus, the derivative is zero when p = 1/2. Taking the second derivative shows that the maximum is attained at this value.

26. Let X denote the number of games played.
    (a) P{X = 2} = p^2 + (1 − p)^2, P{X = 3} = 2p(1 − p).
        E[X] = 2[p^2 + (1 − p)^2] + 6p(1 − p) = 2 + 2p(1 − p).
        Since p(1 − p) is maximized when p = 1/2, we see that E[X] is maximized at that value of p.
    (b) P{X = 3} = p^3 + (1 − p)^3.
        P{X = 4} = P{X = 4, I has 2 wins in first 3 games} + P{X = 4, II has 2 wins in first 3 games}
                 = 3p^2(1 − p)p + 3p(1 − p)^2(1 − p).
        P{X = 5} = P{each player has 2 wins in the first 4 games} = 6p^2(1 − p)^2.
        E[X] = 3[p^3 + (1 − p)^3] + 12p(1 − p)[p^2 + (1 − p)^2] + 30p^2(1 − p)^2.
        Differentiating and setting the derivative equal to 0 shows that the maximum is attained when p = 1/2.

27. P{same number of heads} = Σ_i P{A = i, B = i}
    = Σ_i C(k, i)(1/2)^k C(n − k, i)(1/2)^{n−k}
    = Σ_i C(k, i)C(n − k, i)(1/2)^n
    = Σ_i C(k, k − i)C(n − k, i)(1/2)^n
    = C(n, k)(1/2)^n.
    Another argument is as follows:
    P{# heads of A = # heads of B}
    = P{# tails of A = # heads of B}   (since the coin is fair)
    = P{k − # heads of A = # heads of B}
    = P{k = total # heads}.

28. (a) Consider the first time that the two coins give different results. Then
        P{X = 0} = P{(t, h) | (t, h) or (h, t)} = p(1 − p)/[2p(1 − p)] = 1/2.
    (b) No, with this procedure
        P{X = 0} = P{first flip is a tail} = 1 − p.

29. Each flip after the first will, independently, result in a changeover with probability 1/2. Therefore,
    P{k changeovers} = C(n − 1, k)(1/2)^{n−1}.
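The combinatorial identity used in Problem 27, Σ_i C(k, i)C(n − k, i) = C(n, k), can be confirmed directly; the values of n and k below are arbitrary illustrative choices.

```python
from math import comb

n, k = 12, 5
lhs = sum(comb(k, i) * comb(n - k, i) for i in range(min(k, n - k) + 1))
print(lhs, comb(n, k))   # 792 792
```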
30. P{X = i}/P{X = i − 1} = [e^{−λ}λ^i/i!]/[e^{−λ}λ^{i−1}/(i − 1)!] = λ/i.
    Hence, P{X = i} is increasing for λ ≥ i and decreasing for λ < i.

32. (a) .394  (b) .303  (c) .091.

33. c ∫_{−1}^{1} (1 − x^2) dx = 1,
    c [x − x^3/3]_{−1}^{1} = 1,
    c = 3/4.
    F(y) = (3/4) ∫_{−1}^{y} (1 − x^2) dx = (3/4)[y − y^3/3 + 2/3], −1 < y < 1.

34. c ∫_0^2 (4x − 2x^2) dx = 1,
    c (2x^2 − 2x^3/3)|_0^2 = 1,
    8c/3 = 1, so c = 3/8.
    P{1/2 < X < 3/2} = (3/8) ∫_{1/2}^{3/2} (4x − 2x^2) dx = 11/16.

35. P{X > 20} = ∫_{20}^{∞} (10/x^2) dx = 1/2.

36. P{D ≤ x} = (area of disk of radius x)/(area of disk of radius 1) = πx^2/π = x^2.

37. P{M ≤ x} = P{max(X_1, …, X_n) ≤ x} = P{X_1 ≤ x, …, X_n ≤ x} = Π_{i=1}^{n} P{X_i ≤ x} = x^n.
    f_M(x) = (d/dx) P{M ≤ x} = n x^{n−1}.

38. c = 2.

39. E[X] = 31/6.

40. Let X denote the number of games played.
    P{X = 4} = p^4 + (1 − p)^4.
    P{X = 5} = P{X = 5, I wins 3 of first 4} + P{X = 5, II wins 3 of first 4}
             = 4p^3(1 − p)p + 4(1 − p)^3 p(1 − p).
    P{X = 6} = P{X = 6, I wins 3 of first 5} + P{X = 6, II wins 3 of first 5}
             = 10p^3(1 − p)^2 p + 10p^2(1 − p)^3(1 − p).
    P{X = 7} = P{first 6 games are split} = 20p^3(1 − p)^3.
    E[X] = Σ_{i=4}^{7} i P{X = i}. When p = 1/2, E[X] = 93/16 = 5.8125.

41. Let X_i equal 1 if a changeover results from the ith flip and let it be 0 otherwise. Then
    Number of changeovers = Σ_{i=2}^{n} X_i.
    As E[X_i] = P{X_i = 1} = P{flip i − 1 ≠ flip i} = 2p(1 − p),
    we see that
    E[Number of changeovers] = Σ_{i=2}^{n} E[X_i] = 2(n − 1)p(1 − p).

42. Suppose the coupon collector has i different types. Let X_i denote the number of additional coupons collected until the collector has i + 1 types. It is easy to see that the X_i are independent geometric random variables with respective parameters (n − i)/n, i = 0, 1, …, n − 1. Therefore,
    E[Σ_{i=0}^{n−1} X_i] = Σ_{i=0}^{n−1} E[X_i] = Σ_{i=0}^{n−1} n/(n − i) = n Σ_{j=1}^{n} 1/j.
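Problem 42's answer n Σ_{j=1}^{n} 1/j is the classical coupon-collector mean, and a short simulation reproduces it. This sketch is illustrative only; n = 10 is an arbitrary choice.

```python
import random

def coupons_needed(n):
    """Number of coupons bought until all n types have been collected."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

n, trials = 10, 20_000
avg = sum(coupons_needed(n) for _ in range(trials)) / trials
print(avg, n * sum(1 / j for j in range(1, n + 1)))   # both ≈ 29.3
```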
43. (a) X = Σ_{i=1}^{n} X_i.
    (b) E[X_i] = P{X_i = 1} = P{red ball i is chosen before all n black balls} = 1/(n + 1), since each of these n + 1 balls is equally likely to be the one chosen earliest. Therefore,
        E[X] = Σ_{i=1}^{n} E[X_i] = n/(n + 1).

44. (a) Let Y_i equal 1 if red ball i is chosen after the first but before the second black ball, i = 1, …, n. Then
        Y = Σ_{i=1}^{n} Y_i.
    (b) E[Y_i] = P{Y_i = 1} = P{red ball i is the second chosen from a set of n + 1 balls} = 1/(n + 1), since each of the n + 1 is equally likely to be the second one chosen. Therefore,
        E[Y] = n/(n + 1).
    (c) Answer is the same as in Problem 41.
    (d) We can let the outcome of this experiment be the vector (R_1, R_2, …, R_n), where R_i is the number of red balls chosen after the (i − 1)st but before the ith black ball. Since all orderings of the n + m balls are equally likely, it follows that all different orderings of R_1, …, R_n will have the same probability distribution. For instance,
        P{R_1 = a, R_2 = b} = P{R_2 = a, R_1 = b}.
        From this it follows that all the R_i have the same distribution and thus the same mean.

45. Let N_i denote the number of keys in box i, i = 1, …, k. Then, with X equal to the number of collisions, we have
    X = Σ_{i=1}^{k} (N_i − 1)^+ = Σ_{i=1}^{k} (N_i − 1 + I{N_i = 0}),
    where I{N_i = 0} is equal to 1 if N_i = 0 and is equal to 0 otherwise. Hence,
    E[X] = Σ_{i=1}^{k} (r p_i − 1 + (1 − p_i)^r) = r − k + Σ_{i=1}^{k} (1 − p_i)^r.
    Another way to solve this problem is to let Y denote the number of boxes having at least one key, and then use the identity X = r − Y, which is true since only the first key put in each box does not result in a collision. Writing Y = Σ_{i=1}^{k} I{N_i > 0} and taking expectations yields
    E[X] = r − E[Y] = r − Σ_{i=1}^{k} [1 − (1 − p_i)^r] = r − k + Σ_{i=1}^{k} (1 − p_i)^r.

46. Using that X = Σ_{n=1}^{∞} I_n, we obtain
    E[X] = Σ_{n=1}^{∞} E[I_n] = Σ_{n=1}^{∞} P{X ≥ n}.
    Making the change of variables m = n − 1 gives
    E[X] = Σ_{m=0}^{∞} P{X ≥ m + 1} = Σ_{m=0}^{∞} P{X > m}.

47. Let X_i be 1 if trial i is a success and 0 otherwise.
    (a) The largest value is .6. If X_1 = X_2 = X_3, then
        1.8 = E[X] = 3E[X_1] = 3P{X_1 = 1},
        and so P{X = 3} = P{X_1 = 1} = .6. That this is the largest value is seen by Markov's inequality, which yields
        P{X ≥ 3} ≤ E[X]/3 = .6.
    (b) The smallest value is 0. To construct a probability scenario for which P{X = 3} = 0, let U be a uniform random variable on (0, 1), and define
        X_1 = 1 if U ≤ .6, and 0 otherwise;
        X_2 = 1 if U ≥ .4, and 0 otherwise;
        X_3 = 1 if either U ≤ .3 or U ≥ .7, and 0 otherwise.
        It is easy to see that P{X_1 = X_2 = X_3 = 1} = 0.
48. 1/3.

49. E[X^2] − (E[X])^2 = Var(X) = E[(X − E[X])^2] ≥ 0. Equality holds when Var(X) = 0, that is, when X is constant.

50. Var(cX) = E[(cX − E[cX])^2] = E[c^2(X − E[X])^2] = c^2 Var(X).
    Var(c + X) = E[(c + X − E[c + X])^2] = E[(X − E[X])^2] = Var(X).

51. N = Σ_{i=1}^{r} X_i, where X_i is the number of flips between the (i − 1)st and ith head. Hence, X_i is geometric with mean 1/p. Thus,
    E[N] = Σ_{i=1}^{r} E[X_i] = r/p.

52. (a) n/(n + 1).  (b) 0.  (c) 1.

53. 1/(n + 1), 1/(2n + 1) − [1/(n + 1)]^2.

54. (a) Using the fact that E[X + Y] = 0, we see that 0 = 2p(1, 1) − 2p(−1, −1), which gives the result.
    (b) This follows since 0 = E[X − Y] = 2p(1, −1) − 2p(−1, 1).
    (c) Var(X) = E[X^2] = 1.
    (d) Var(Y) = E[Y^2] = 1.
    (e) Since
        1 = p(1, 1) + p(−1, 1) + p(1, −1) + p(−1, −1) = 2p(1, 1) + 2p(1, −1),
        we see that if p = 2p(1, 1) then 1 − p = 2p(1, −1). Now,
        Cov(X, Y) = E[XY] = p(1, 1) + p(−1, −1) − p(1, −1) − p(−1, 1) = p − (1 − p) = 2p − 1.

55. 1 = ∫_0^a f(x) dx + ∫_a^∞ f(x) dx ≤ ∫_0^a c dx + P{X > a} = ac + P{X > a}.

56. Let X_i equal 1 if there is a type i coupon in the collection, and let it be 0 otherwise. The number of distinct types is X = Σ_{i=1}^{n} X_i.
    E[X] = Σ_{i=1}^{n} E[X_i] = Σ_{i=1}^{n} P{X_i = 1} = Σ_{i=1}^{n} [1 − (1 − p_i)^k].
    To compute Cov(X_i, X_j) when i ≠ j, note that X_iX_j is either 1 or 0, and that it equals 0 if there is either no type i or no type j coupon in the collection. Therefore,
    P{X_iX_j = 0} = P{X_i = 0} + P{X_j = 0} − P{X_i = X_j = 0} = (1 − p_i)^k + (1 − p_j)^k − (1 − p_i − p_j)^k.
    Consequently, for i ≠ j,
    Cov(X_i, X_j) = P{X_iX_j = 1} − E[X_i]E[X_j]
                 = 1 − [(1 − p_i)^k + (1 − p_j)^k − (1 − p_i − p_j)^k] − [1 − (1 − p_i)^k][1 − (1 − p_j)^k].
    Because Var(X_i) = (1 − p_i)^k[1 − (1 − p_i)^k], we obtain
    Var(X) = Σ_{i=1}^{n} Var(X_i) + 2 Σ Σ_{i<j} Cov(X_i, X_j).

57. It is the number of successes in n + m independent p-trials.

58. Let X_i equal 1 if both balls of the ith withdrawn pair are red, and let it equal 0 otherwise. Because
    E[X_i] = P{X_i = 1} = r(r − 1)/[2n(2n − 1)],
    we have
    E[X] = Σ_{i=1}^{n} E[X_i] = r(r − 1)/(4n − 2).
    Because
    E[X_iX_j] = r(r − 1)(r − 2)(r − 3)/[2n(2n − 1)(2n − 2)(2n − 3)],
    for Var(X) use
    Var(X) = Σ_i Var(X_i) + 2 Σ_{i<j} Cov(X_i, X_j) = n Var(X_1) + n(n − 1) Cov(X_1, X_2),
    where
    Var(X_1) = E[X_1](1 − E[X_1]),
    Cov(X_1, X_2) = r(r − 1)(r − 2)(r − 3)/[2n(2n − 1)(2n − 2)(2n − 3)] − (E[X_1])^2.
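Problem 51's conclusion E[N] = r/p can be checked by simulating the flips directly. A minimal sketch, with r and p chosen arbitrarily for illustration:

```python
import random

def flips_until_r_heads(r, p):
    """Number of flips of a p-coin needed to obtain r heads (Problem 51)."""
    flips = heads = 0
    while heads < r:
        flips += 1
        if random.random() < p:
            heads += 1
    return flips

r, p, trials = 5, 0.3, 20_000
avg = sum(flips_until_r_heads(r, p) for _ in range(trials)) / trials
print(avg, r / p)   # both ≈ 16.7
```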
59. (a) Use the fact that F(X_i) is a uniform (0, 1) random variable to obtain
        p = P{F(X_1) < F(X_2) > F(X_3) < F(X_4)} = P{U_1 < U_2 > U_3 < U_4},
        where the U_i, i = 1, 2, 3, 4, are independent uniform (0, 1) random variables.
    (b) p = ∫_0^1 ∫_{x_1}^1 ∫_0^{x_2} ∫_{x_3}^1 dx_4 dx_3 dx_2 dx_1
          = ∫_0^1 ∫_{x_1}^1 ∫_0^{x_2} (1 − x_3) dx_3 dx_2 dx_1
          = ∫_0^1 ∫_{x_1}^1 (x_2 − x_2^2/2) dx_2 dx_1
          = ∫_0^1 (1/3 − x_1^2/2 + x_1^3/6) dx_1
          = 1/3 − 1/6 + 1/24 = 5/24.
    (c) There are 5 (of the 24 possible) orderings such that X_1 < X_2 > X_3 < X_4. They are:
        X_2 > X_4 > X_3 > X_1, X_2 > X_4 > X_1 > X_3, X_2 > X_1 > X_4 > X_3, X_4 > X_2 > X_3 > X_1, X_4 > X_2 > X_1 > X_3.

60. E[e^{tX}] = ∫_0^1 e^{tx} dx = (e^t − 1)/t.
    (d/dt) E[e^{tX}] = (te^t − e^t + 1)/t^2.
    (d^2/dt^2) E[e^{tX}] = [t^2(te^t + e^t − e^t) − 2t(te^t − e^t + 1)]/t^4 = [t^2 e^t − 2(te^t − e^t + 1)]/t^3.
    To evaluate at t = 0, we must apply l'Hôpital's rule. This yields
    E[X] = lim_{t→0} (te^t + e^t − e^t)/(2t) = lim_{t→0} e^t/2 = 1/2,
    E[X^2] = lim_{t→0} (2te^t + t^2 e^t − 2te^t − 2e^t + 2e^t)/(3t^2) = lim_{t→0} e^t/3 = 1/3.
    Hence, Var(X) = 1/3 − (1/2)^2 = 1/12.

61. φ(t) = (1/3)(e^t + e^{2t} + e^{3t}).

62. E[e^{αλX}] = ∫ e^{αλx} λe^{−λx} dx = 1/(1 − α), and
    P = −(1/(αλ)) ln(1 − α).
    The inequality ln(1 − x) ≤ −x shows that P ≥ 1/λ.

63. φ(t) = Σ_{n=1}^{∞} e^{tn}(1 − p)^{n−1} p = pe^t Σ_{n=1}^{∞} [(1 − p)e^t]^{n−1} = pe^t/[1 − (1 − p)e^t].

64. (See Section 2.3 of Chapter 5.)

65. Cov(X_i, X_j) = Cov(μ_i + Σ_{k=1}^{n} a_{ik}Z_k, μ_j + Σ_{t=1}^{n} a_{jt}Z_t)
               = Σ_{t=1}^{n} Σ_{k=1}^{n} Cov(a_{ik}Z_k, a_{jt}Z_t)
               = Σ_{t=1}^{n} Σ_{k=1}^{n} a_{ik}a_{jt} Cov(Z_k, Z_t)
               = Σ_{k=1}^{n} a_{ik}a_{jk},
    where the last equality follows since Cov(Z_k, Z_t) = 1 if k = t and 0 if k ≠ t.
66. P{|(X_1 + ⋯ + X_n)/n − μ| > ε} = P{|X_1 + ⋯ + X_n − nμ| > nε}
    ≤ Var(X_1 + ⋯ + X_n)/(n^2 ε^2) = nσ^2/(n^2 ε^2) → 0 as n → ∞.

67. P{5 < X < 15} ≥ 2/5.

68. (i) P{X_1 + ⋯ + X_10 > 15} ≤ 2/3.
    (ii) P{X_1 + ⋯ + X_10 > 15} ≈ 1 − Φ(5/√10).

69. Φ(1) − Φ(1/2) = .1498.

70. Let X_i be Poisson with mean 1. Then
    P{Σ_{i=1}^{n} X_i ≤ n} = e^{−n} Σ_{k=0}^{n} n^k/k!.
    But for n large, Σ_{i=1}^{n} X_i − n has approximately a normal distribution with mean 0, and so the result follows.

71. (i) P{X = i} = C(n, i)C(m, k − i)/C(n + m, k), i = 0, 1, …, min(k, n).
    (ii) X = Σ_{i=1}^{k} X_i, and E[X] = Σ_{i=1}^{k} E[X_i] = kn/(n + m), since the ith ball selected is equally likely to be any of the n + m balls, and so E[X_i] = P{X_i = 1} = n/(n + m).
    Alternatively, writing X = Σ_{i=1}^{n} Y_i, where Y_i indicates whether the ith white ball is selected,
    E[X] = Σ_{i=1}^{n} E[Y_i] = Σ_{i=1}^{n} P{ith white ball is selected} = Σ_{i=1}^{n} k/(n + m) = nk/(n + m).

72. For the matching problem, letting X = X_1 + ⋯ + X_N, where X_i = 1 if the ith man selects his own hat and 0 otherwise, we obtain
    Var(X) = Σ_{i=1}^{N} Var(X_i) + 2 Σ Σ_{i<j} Cov(X_i, X_j).
    Since P{X_i = 1} = 1/N, we see
    Var(X_i) = (1/N)(1 − 1/N) = (N − 1)/N^2.
    Also, Cov(X_i, X_j) = E[X_iX_j] − E[X_i]E[X_j]. Now X_iX_j = 1 if the ith and jth men both select their own hats, and 0 otherwise, and thus
    E[X_iX_j] = P{X_i = 1, X_j = 1} = P{X_i = 1}P{X_j = 1 | X_i = 1} = 1/[N(N − 1)].
    Hence,
    Cov(X_i, X_j) = 1/[N(N − 1)] − (1/N)^2 = 1/[N^2(N − 1)],
    and
    Var(X) = (N − 1)/N + 2 C(N, 2) · 1/[N^2(N − 1)] = (N − 1)/N + 1/N = 1.
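Problem 72's conclusion that the number of matches has variance 1 (and mean 1) for every N is easy to see empirically. A minimal simulation sketch, with N chosen arbitrarily:

```python
import random
import statistics

def matches(N):
    """Number of men who get their own hat in a random assignment."""
    hats = list(range(N))
    random.shuffle(hats)
    return sum(hats[i] == i for i in range(N))

N, trials = 20, 50_000
data = [matches(N) for _ in range(trials)]
print(statistics.mean(data), statistics.pvariance(data))   # both ≈ 1
```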
73. As N_i is a binomial random variable with parameters (n, P_i), we have
    (i) E[N_i] = nP_i;
    (ii) Var(N_i) = nP_i(1 − P_i);
    (iii) for i ≠ j, the covariance of N_i and N_j can be computed as
         Cov(N_i, N_j) = Cov(Σ_k X_k, Σ_k Y_k),
         where X_k (Y_k) is 1 or 0, depending upon whether or not outcome k is of type i (j). Hence,
         Cov(N_i, N_j) = Σ_k Σ_ℓ Cov(X_k, Y_ℓ).
         Now for k ≠ ℓ, Cov(X_k, Y_ℓ) = 0 by independence of trials, and so
         Cov(N_i, N_j) = Σ_k Cov(X_k, Y_k) = Σ_k (E[X_kY_k] − E[X_k]E[Y_k]) = −Σ_k E[X_k]E[Y_k] (since X_kY_k = 0) = −Σ_k P_iP_j = −nP_iP_j.
    (iv) Letting Y_i = 1 if no type i's occur, and 0 otherwise, the number of outcomes that never occur is equal to Σ_{i=1}^{r} Y_i, and thus
         E[Σ_{i=1}^{r} Y_i] = Σ_{i=1}^{r} E[Y_i] = Σ_{i=1}^{r} P{outcome i does not occur} = Σ_{i=1}^{r} (1 − P_i)^n.

74. (i) As the random variables are independent, identically distributed, and continuous, it follows that, with probability 1, they will all have different values. Hence the largest of X_1, …, X_n is equally likely to be either X_1 or X_2 … or X_n. Hence, as there is a record at time n when X_n is the largest value, it follows that
        P{a record occurs at n} = 1/n.
    (ii) Let I_j = 1 if a record occurs at j, and 0 otherwise. Then
         E[Σ_{j=1}^{n} I_j] = Σ_{j=1}^{n} E[I_j] = Σ_{j=1}^{n} 1/j.
    (iii) It is easy to see that the random variables I_1, I_2, …, I_n are independent. For instance, for j < k,
         P{I_j = 1 | I_k = 1} = P{I_j = 1},
         since knowing that X_k is the largest of X_1, …, X_j, …, X_k clearly tells us nothing about whether or not X_j is the largest of X_1, …, X_j. Hence,
         Var(Σ_{j=1}^{n} I_j) = Σ_{j=1}^{n} Var(I_j) = Σ_{j=1}^{n} (1/j)[(j − 1)/j].
    (iv) P{N > n} = P{X_1 is the largest of X_1, …, X_n} = 1/n. Hence,
         E[N] = Σ_{n=1}^{∞} P{N > n} = Σ_{n=1}^{∞} 1/n = ∞.

75. (i) Knowing the values of N_1, …, N_j is equivalent to knowing the relative ordering of the elements a_1, …, a_j. For instance, if N_1 = 0, N_2 = 1, N_3 = 1, then in the random permutation a_2 is before a_3, which is before a_1. The independence result follows because the number of a_1, …, a_i that follow a_{i+1} clearly does not probabilistically depend on the relative ordering of a_1, …, a_i.
    (ii) P{N_i = k} = 1/i, k = 0, 1, …, i − 1, which follows since, of the elements a_1, …, a_{i+1}, the element a_{i+1} is equally likely to be first or second or … or (i + 1)st.
    (iii) E[N_i] = (1/i) Σ_{k=0}^{i−1} k = (i − 1)/2,
          E[N_i^2] = (1/i) Σ_{k=0}^{i−1} k^2 = (i − 1)(2i − 1)/6,
          and so
          Var(N_i) = (i − 1)(2i − 1)/6 − (i − 1)^2/4 = (i^2 − 1)/12.

76. E[XY] = μ_x μ_y,
    E[(XY)^2] = (μ_x^2 + σ_x^2)(μ_y^2 + σ_y^2),
    Var(XY) = E[(XY)^2] − (E[XY])^2.
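Problem 74(ii) says the expected number of records among n observations is the harmonic sum Σ_{j=1}^{n} 1/j. The following illustrative simulation (not part of the manual) confirms it for n = 50.

```python
import random

def count_records(n):
    """Count record values in a sequence of n i.i.d. uniform observations."""
    best, count = float('-inf'), 0
    for _ in range(n):
        x = random.random()
        if x > best:
            best, count = x, count + 1
    return count

n, trials = 50, 20_000
avg = sum(count_records(n) for _ in range(trials)) / trials
print(avg, sum(1 / j for j in range(1, n + 1)))   # both ≈ 4.50
```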