Introduction to Probability
Models
Tenth Edition
Sheldon M. Ross
AMSTERDAM • BOSTON • HEIDELBERG • LONDON • NEW YORK • OXFORD • PARIS • SAN DIEGO • SAN FRANCISCO • SINGAPORE • SYDNEY • TOKYO
Academic Press is an imprint of Elsevier 30 Corporate Drive, Suite 400, Burlington, MA 01803, USA 525 B Street, Suite 1900, San Diego, California 92101-4495, USA Elsevier, The Boulevard, Langford Lane, Kidlington, Oxford, OX5 1GB, UK
Copyright © 2010 Elsevier Inc. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher’s permissions policies and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.
This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).
Notices Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.
Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.
To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.
Library of Congress Cataloging-in-Publication Data
Ross, Sheldon M.
Introduction to probability models / Sheldon M. Ross. – 10th ed.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-12-375686-2 (hardcover : alk. paper)
1. Probabilities. I. Title.
QA273.R84 2010
519.2–dc 2009040399
British Library Cataloguing-in-Publication Data A catalogue record for this book is available from the British Library.
ISBN: 978-0-12-375686-2
For information on all Academic Press publications visit our Web site at www.elsevierdirect.com
Typeset by: diacriTech, India
Printed in the United States of America 09 10 11 9 8 7 6 5 4 3 2 1
Preface
the new Section 8.3.3 on birth and death queueing models. Section 11.8.2 gives a new approach that can be used to simulate the exact stationary distribution of a Markov chain that satisfies a certain property. Among the newly added examples are 1.11, which is concerned with a multiple player gambling problem; 3.20, which finds the variance in the matching rounds problem; 3.30, which deals with the characteristics of a random selection from a population; and 4.25, which deals with the stationary distribution of a Markov chain.
Course
Ideally, this text would be used in a one-year course in probability models. Other possible courses would be a one-semester course in introductory probability theory (involving Chapters 1–3 and parts of others) or a course in elementary stochastic processes. The textbook is designed to be flexible enough to be used in a variety of possible courses. For example, I have used Chapters 5 and 8, with smatterings from Chapters 4 and 6, as the basis of an introductory course in queueing theory.
Examples and Exercises
Many examples are worked out throughout the text, and there are also a large number of exercises to be solved by students. More than 100 of these exercises have been starred and their solutions provided at the end of the text. These starred problems can be used for independent study and test preparation. An Instructor’s Manual, containing solutions to all exercises, is available free to instructors who adopt the book for class.
Organization
Chapters 1 and 2 deal with basic ideas of probability theory. In Chapter 1 an axiomatic framework is presented, while in Chapter 2 the important concept of a random variable is introduced. Subsection 2.6.1 gives a simple derivation of the joint distribution of the sample mean and sample variance of a normal data sample.

Chapter 3 is concerned with the subject matter of conditional probability and conditional expectation. “Conditioning” is one of the key tools of probability theory, and it is stressed throughout the book. When properly used, conditioning often enables us to easily solve problems that at first glance seem quite difficult. The final section of this chapter presents applications to (1) a computer list problem, (2) a random graph, and (3) the Polya urn model and its relation to the Bose-Einstein distribution. Subsection 3.6.5 presents k-record values and the surprising Ignatov’s theorem.
In Chapter 4 we come into contact with our first random, or stochastic, process, known as a Markov chain, which is widely applicable to the study of many real-world phenomena. Applications to genetics and production processes are presented. The concept of time reversibility is introduced and its usefulness illustrated. Subsection 4.5.3 presents an analysis, based on random walk theory, of a probabilistic algorithm for the satisfiability problem. Section 4.6 deals with the mean times spent in transient states by a Markov chain. Section 4.9 introduces Markov chain Monte Carlo methods. In the final section we consider a model for optimally making decisions known as a Markovian decision process.

In Chapter 5 we are concerned with a type of stochastic process known as a counting process. In particular, we study a kind of counting process known as a Poisson process. The intimate relationship between this process and the exponential distribution is discussed. New derivations for the Poisson and nonhomogeneous Poisson processes are discussed. Examples relating to analyzing greedy algorithms, minimizing highway encounters, collecting coupons, and tracking the AIDS virus, as well as material on compound Poisson processes, are included in this chapter. Subsection 5.2.4 gives a simple derivation of the convolution of exponential random variables.

Chapter 6 considers Markov chains in continuous time with an emphasis on birth and death models. Time reversibility is shown to be a useful concept, as it is in the study of discrete-time Markov chains. Section 6.7 presents the computationally important technique of uniformization.

Chapter 7, the renewal theory chapter, is concerned with a type of counting process more general than the Poisson. By making use of renewal reward processes, limiting results are obtained and applied to various fields. Section 7.9 presents new results concerning the distribution of time until a certain pattern occurs when a sequence of independent and identically distributed random variables is observed. In Subsection 7.9.1, we show how renewal theory can be used to derive both the mean and the variance of the length of time until a specified pattern appears, as well as the mean time until one of a finite number of specified patterns appears. In Subsection 7.9.2, we suppose that the random variables are equally likely to take on any of m possible values, and compute an expression for the mean time until a run of m distinct values occurs. In Subsection 7.9.3, we suppose the random variables are continuous and derive an expression for the mean time until a run of m consecutive increasing values occurs.

Chapter 8 deals with queueing, or waiting line, theory. After some preliminaries dealing with basic cost identities and types of limiting probabilities, we consider exponential queueing models and show how such models can be analyzed. Included in the models we study is the important class known as a network of queues. We then study models in which some of the distributions are allowed to be arbitrary. Included are Subsection 8.6.3, dealing with an optimization problem concerning a single server, general service time queue, and Section 8.8, concerned with a single server, general service time queue in which the arrival source is a finite number of potential users.
Jean Lemaire, University of Pennsylvania
Andrew Lim, University of California, Berkeley
George Michailidis, University of Michigan
Donald Minassian, Butler University
Joseph Mitchell, State University of New York, Stony Brook
Krzysztof Osfaszewski, University of Illinois
Erol Pekoz, Boston University
Evgeny Poletsky, Syracuse University
James Propp, University of Massachusetts, Lowell
Anthony Quas, University of Victoria
Charles H. Roumeliotis, Proofreader
David Scollnik, University of Calgary
Mary Shepherd, Northwest Missouri State University
Galen Shorack, University of Washington, Seattle
Marcus Sommereder, Vienna University of Technology
Osnat Stramer, University of Iowa
Gabor Szekeley, Bowling Green State University
Marlin Thomas, Purdue University
Henk Tijms, Vrije University
Zhenyuan Wang, University of Binghamton
Ward Whitt, Columbia University
Bo Xhang, Georgia University of Technology
Julie Zhou, University of Victoria
Introduction to Probability Theory
Some examples are the following.
1. Flipping a coin:

S = {H, T}

where H means that the outcome of the toss is a head and T that it is a tail.
2. Rolling a die:

S = {1, 2, 3, 4, 5, 6}

where the outcome i means that i appeared on the die, i = 1, 2, 3, 4, 5, 6.
3. Flipping two coins:

S = {(H, H), (H, T), (T, H), (T, T)}

The outcome will be (H, H) if both coins come up heads; it will be (H, T) if the first coin comes up heads and the second comes up tails; it will be (T, H) if the first comes up tails and the second heads; and it will be (T, T) if both coins come up tails.
4. Rolling two dice:

S = { (1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
      (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
      (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
      (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
      (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
      (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6) }

where the outcome (i, j) is said to occur if i appears on the first die and j on the second die.
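The two-dice sample space can also be enumerated programmatically. The following is a minimal Python sketch (the names are illustrative, not from the text): it lists every ordered pair (i, j) and picks out the event "the sum of the dice equals seven", which reappears in Example (4′) below.

```python
from itertools import product

# Enumerate the sample space for rolling two dice: all ordered pairs (i, j)
S = [(i, j) for i, j in product(range(1, 7), repeat=2)]

# An event is a subset of S, e.g. "the sum of the dice equals seven"
sum_is_seven = [(i, j) for (i, j) in S if i + j == 7]
```

Since each of the 36 outcomes is equally likely, the probability of this event is len(sum_is_seven) / len(S) = 6/36 = 1/6.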
5. Measuring the lifetime of a car (in years):

S = [0, ∞)*
Any subset E of the sample space S is known as an event. Some examples of events are the following.
1′. In Example (1), if E = {H}, then E is the event that a head appears on the flip of the coin. Similarly, if E = {T}, then E would be the event that a tail appears.
2′. In Example (2), if E = {1}, then E is the event that one appears on the roll of the die. If E = {2, 4, 6}, then E would be the event that an even number appears on the roll.
* The set (a, b) is defined to consist of all points x such that a < x < b. The set [a, b] is defined to consist of all points x such that a ≤ x ≤ b. The sets (a, b] and [a, b) are defined, respectively, to consist of all points x such that a < x ≤ b and all points x such that a ≤ x < b.
1.2 Sample Space and Events
3′. In Example (3), if E = {(H, H), (H, T)}, then E is the event that a head appears on the first coin.
4′. In Example (4), if E = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}, then E is the event that the sum of the dice equals seven.
5′. In Example (5), if E = (2, 6), then E is the event that the car lasts between two and six years.

We say that the event E occurs when the outcome of the experiment lies in E. For any two events E and F of a sample space S we define the new event E ∪ F to consist of all outcomes that are either in E or in F or in both E and F. That is, the event E ∪ F will occur if either E or F occurs. For example, in (1) if E = {H} and F = {T}, then
E ∪ F = { H , T }
That is, E ∪ F would be the whole sample space S. In (2) if E = {1, 3, 5} and F = {1, 2, 3}, then
E ∪ F = {1, 2, 3, 5}
and thus E ∪ F would occur if the outcome of the die is 1 or 2 or 3 or 5. The event E ∪ F is often referred to as the union of the event E and the event F. For any two events E and F , we may also define the new event EF , sometimes written E ∩ F , and referred to as the intersection of E and F , as follows. EF consists of all outcomes which are both in E and in F. That is, the event EF will occur only if both E and F occur. For example, in (2) if E = {1, 3, 5} and F = {1, 2, 3}, then
EF = {1, 3}
and thus EF would occur if the outcome of the die is either 1 or 3. In Example (1) if E = {H} and F = {T}, then the event EF would not consist of any outcomes and hence could not occur. To give such an event a name, we shall refer to it as the null event and denote it by Ø. (That is, Ø refers to the event consisting of no outcomes.) If EF = Ø, then E and F are said to be mutually exclusive.

We also define unions and intersections of more than two events in a similar manner. If E_1, E_2, ... are events, then the union of these events, denoted by ⋃_{n=1}^{∞} E_n, is defined to be the event that consists of all outcomes that are in E_n for at least one value of n = 1, 2, .... Similarly, the intersection of the events E_n, denoted by ⋂_{n=1}^{∞} E_n, is defined to be the event consisting of those outcomes that are in all of the events E_n, n = 1, 2, ....

Finally, for any event E we define the new event E^c, referred to as the complement of E, to consist of all outcomes in the sample space S that are not in E. That is, E^c will occur if and only if E does not occur. In Example (4)
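The union, intersection, and complement operations just described map directly onto Python's built-in set type. As a small illustrative sketch (not part of the text), here are the die-roll events E = {1, 3, 5} and F = {1, 2, 3} used in the examples above:

```python
# Sample space for rolling a die, and the events from the text
S = {1, 2, 3, 4, 5, 6}
E = {1, 3, 5}         # an odd number appears
F = {1, 2, 3}

union = E | F         # E ∪ F: outcomes in E, in F, or in both
intersection = E & F  # EF (E ∩ F): outcomes in both E and F
complement = S - E    # E^c: outcomes of S that are not in E

# E and E^c are mutually exclusive: their intersection is the null event Ø
null_event = E & complement
```

Checking these against the text: union is {1, 2, 3, 5}, intersection is {1, 3}, and null_event is the empty set, matching the definition of mutually exclusive events.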