
Introduction to Probability Models

Tenth Edition


Sheldon M. Ross

University of Southern California

Los Angeles, California

AMSTERDAM • BOSTON • HEIDELBERG • LONDON • NEW YORK • OXFORD • PARIS • SAN DIEGO • SAN FRANCISCO • SINGAPORE • SYDNEY • TOKYO

Academic Press is an imprint of Elsevier

Academic Press is an imprint of Elsevier
30 Corporate Drive, Suite 400, Burlington, MA 01803, USA
525 B Street, Suite 1900, San Diego, California 92101-4495, USA
Elsevier, The Boulevard, Langford Lane, Kidlington, Oxford, OX5 1GB, UK

Copyright © 2010 Elsevier Inc. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher’s permissions policies and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.

This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).

Notices Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.

Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.

To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

Library of Congress Cataloging-in-Publication Data
Ross, Sheldon M.
Introduction to probability models / Sheldon M. Ross. – 10th ed.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-12-375686-2 (hardcover : alk. paper)
1. Probabilities. I. Title.
QA273.R84 2010
519.2–dc 2009040399

British Library Cataloguing-in-Publication Data A catalogue record for this book is available from the British Library.

ISBN: 978-0-12-375686-2

For information on all Academic Press publications visit our Web site at www.elsevierdirect.com

Typeset by: diacriTech, India

Printed in the United States of America 09 10 11 9 8 7 6 5 4 3 2 1

Contents

  • Preface
  • 1 Introduction to Probability Theory
    • 1.1 Introduction
    • 1.2 Sample Space and Events
    • 1.3 Probabilities Defined on Events
    • 1.4 Conditional Probabilities
    • 1.5 Independent Events
    • 1.6 Bayes’ Formula
    • Exercises
    • References
  • 2 Random Variables
    • 2.1 Random Variables
    • 2.2 Discrete Random Variables
      • 2.2.1 The Bernoulli Random Variable
      • 2.2.2 The Binomial Random Variable
      • 2.2.3 The Geometric Random Variable
      • 2.2.4 The Poisson Random Variable
    • 2.3 Continuous Random Variables
      • 2.3.1 The Uniform Random Variable
      • 2.3.2 Exponential Random Variables
      • 2.3.3 Gamma Random Variables
      • 2.3.4 Normal Random Variables
    • 2.4 Expectation of a Random Variable
      • 2.4.1 The Discrete Case
      • 2.4.2 The Continuous Case
      • 2.4.3 Expectation of a Function of a Random Variable
    • 2.5 Jointly Distributed Random Variables
      • 2.5.1 Joint Distribution Functions
      • 2.5.2 Independent Random Variables
      • 2.5.3 Covariance and Variance of Sums of Random Variables
      • 2.5.4 Joint Probability Distribution of Functions of Random Variables
    • 2.6 Moment Generating Functions
      • 2.6.1 The Joint Distribution of the Sample Mean and Sample Variance from a Normal Population
    • 2.7 The Distribution of the Number of Events that Occur
    • 2.8 Limit Theorems
    • 2.9 Stochastic Processes
    • Exercises
    • References
  • 3 Conditional Probability and Conditional Expectation
    • 3.1 Introduction
    • 3.2 The Discrete Case
    • 3.3 The Continuous Case
    • 3.4 Computing Expectations by Conditioning
      • 3.4.1 Computing Variances by Conditioning
    • 3.5 Computing Probabilities by Conditioning
    • 3.6 Some Applications
      • 3.6.1 A List Model
      • 3.6.2 A Random Graph
      • 3.6.3 Uniform Priors, Polya’s Urn Model, and Bose–Einstein Statistics
      • 3.6.4 Mean Time for Patterns
      • 3.6.5 The k -Record Values of Discrete Random Variables
      • 3.6.6 Left Skip Free Random Walks
    • 3.7 An Identity for Compound Random Variables
      • 3.7.1 Poisson Compounding Distribution
      • 3.7.2 Binomial Compounding Distribution
      • 3.7.3 A Compounding Distribution Related to the Negative Binomial
    • Exercises
  • 4 Markov Chains
    • 4.1 Introduction
    • 4.2 Chapman–Kolmogorov Equations
    • 4.3 Classification of States
    • 4.4 Limiting Probabilities
    • 4.5 Some Applications
      • 4.5.1 The Gambler’s Ruin Problem
      • 4.5.2 A Model for Algorithmic Efficiency
      • 4.5.3 Using a Random Walk to Analyze a Probabilistic Algorithm for the Satisfiability Problem
    • 4.6 Mean Time Spent in Transient States
    • 4.7 Branching Processes
    • 4.8 Time Reversible Markov Chains
    • 4.9 Markov Chain Monte Carlo Methods
    • 4.10 Markov Decision Processes
    • 4.11 Hidden Markov Chains
      • 4.11.1 Predicting the States
    • Exercises
    • References
  • 5 The Exponential Distribution and the Poisson Process
    • 5.1 Introduction
    • 5.2 The Exponential Distribution
      • 5.2.1 Definition
      • 5.2.2 Properties of the Exponential Distribution
      • 5.2.3 Further Properties of the Exponential Distribution
      • 5.2.4 Convolutions of Exponential Random Variables
    • 5.3 The Poisson Process
      • 5.3.1 Counting Processes
      • 5.3.2 Definition of the Poisson Process
      • 5.3.3 Interarrival and Waiting Time Distributions
      • 5.3.4 Further Properties of Poisson Processes
      • 5.3.5 Conditional Distribution of the Arrival Times
      • 5.3.6 Estimating Software Reliability
    • 5.4 Generalizations of the Poisson Process
      • 5.4.1 Nonhomogeneous Poisson Process
      • 5.4.2 Compound Poisson Process
      • 5.4.3 Conditional or Mixed Poisson Processes
    • Exercises
    • References
  • 6 Continuous-Time Markov Chains
    • 6.1 Introduction
    • 6.2 Continuous-Time Markov Chains
    • 6.3 Birth and Death Processes
    • 6.4 The Transition Probability Function P ij ( t )
    • 6.5 Limiting Probabilities
    • 6.6 Time Reversibility
    • 6.7 Uniformization
    • 6.8 Computing the Transition Probabilities
    • Exercises
    • References
  • 7 Renewal Theory and Its Applications
    • 7.1 Introduction
    • 7.2 Distribution of N ( t )
    • 7.3 Limit Theorems and Their Applications
    • 7.4 Renewal Reward Processes
    • 7.5 Regenerative Processes
      • 7.5.1 Alternating Renewal Processes
    • 7.6 Semi-Markov Processes
    • 7.7 The Inspection Paradox
    • 7.8 Computing the Renewal Function
    • 7.9 Applications to Patterns
      • 7.9.1 Patterns of Discrete Random Variables
      • 7.9.2 The Expected Time to a Maximal Run of Distinct Values
      • 7.9.3 Increasing Runs of Continuous Random Variables
    • 7.10 The Insurance Ruin Problem
    • Exercises
    • References
  • 8 Queueing Theory
    • 8.1 Introduction
    • 8.2 Preliminaries
      • 8.2.1 Cost Equations
      • 8.2.2 Steady-State Probabilities
    • 8.3 Exponential Models
      • 8.3.1 A Single-Server Exponential Queueing System
      • 8.3.2 A Single-Server Exponential Queueing System Having Finite Capacity
      • 8.3.3 Birth and Death Queueing Models
      • 8.3.4 A Shoe Shine Shop
      • 8.3.5 A Queueing System with Bulk Service
    • 8.4 Network of Queues
      • 8.4.1 Open Systems
      • 8.4.2 Closed Systems
    • 8.5 The System M / G /1
      • 8.5.1 Preliminaries: Work and Another Cost Identity
      • 8.5.2 Application of Work to M / G /1
      • 8.5.3 Busy Periods
    • 8.6 Variations on the M / G /1
      • 8.6.1 The M / G /1 with Random-Sized Batch Arrivals
      • 8.6.2 Priority Queues
      • 8.6.3 An M / G /1 Optimization Example
      • 8.6.4 The M / G /1 Queue with Server Breakdown
    • 8.7 The Model G / M /1
      • 8.7.1 The G / M /1 Busy and Idle Periods
    • 8.8 A Finite Source Model
    • 8.9 Multiserver Queues
      • 8.9.1 Erlang’s Loss System
      • 8.9.2 The M / M / k Queue
      • 8.9.3 The G / M / k Queue
      • 8.9.4 The M / G / k Queue
    • Exercises
    • References
  • 9 Reliability Theory
    • 9.1 Introduction
    • 9.2 Structure Functions
      • 9.2.1 Minimal Path and Minimal Cut Sets
    • 9.3 Reliability of Systems of Independent Components
    • 9.4 Bounds on the Reliability Function
      • 9.4.1 Method of Inclusion and Exclusion
      • 9.4.2 Second Method for Obtaining Bounds on r ( p )
    • 9.5 System Life as a Function of Component Lives
    • 9.6 Expected System Lifetime
      • 9.6.1 An Upper Bound on the Expected Life of a Parallel System
    • 9.7 Systems with Repair
      • 9.7.1 A Series Model with Suspended Animation
    • Exercises
    • References
  • 10 Brownian Motion and Stationary Processes
    • 10.1 Brownian Motion
    • 10.2 Hitting Times, Maximum Variable, and the Gambler’s Ruin Problem
    • 10.3 Variations on Brownian Motion
      • 10.3.1 Brownian Motion with Drift
      • 10.3.2 Geometric Brownian Motion
    • 10.4 Pricing Stock Options
      • 10.4.1 An Example in Options Pricing
      • 10.4.2 The Arbitrage Theorem
      • 10.4.3 The Black-Scholes Option Pricing Formula
    • 10.5 White Noise
    • 10.6 Gaussian Processes
    • 10.7 Stationary and Weakly Stationary Processes
    • 10.8 Harmonic Analysis of Weakly Stationary Processes
    • Exercises
    • References
  • 11 Simulation
    • 11.1 Introduction
    • 11.2 General Techniques for Simulating Continuous Random Variables
      • 11.2.1 The Inverse Transformation Method
      • 11.2.2 The Rejection Method
      • 11.2.3 The Hazard Rate Method
    • 11.3 Special Techniques for Simulating Continuous Random Variables
      • 11.3.1 The Normal Distribution
      • 11.3.2 The Gamma Distribution
      • 11.3.3 The Chi-Squared Distribution
      • 11.3.4 The Beta ( n , m ) Distribution
      • 11.3.5 The Exponential Distribution—The Von Neumann Algorithm
    • 11.4 Simulating from Discrete Distributions
      • 11.4.1 The Alias Method
    • 11.5 Stochastic Processes
      • 11.5.1 Simulating a Nonhomogeneous Poisson Process
      • 11.5.2 Simulating a Two-Dimensional Poisson Process
    • 11.6 Variance Reduction Techniques
      • 11.6.1 Use of Antithetic Variables
      • 11.6.2 Variance Reduction by Conditioning
      • 11.6.3 Control Variates
      • 11.6.4 Importance Sampling
    • 11.7 Determining the Number of Runs
    • 11.8 Generating from the Stationary Distribution of a Markov Chain
      • 11.8.1 Coupling from the Past
      • 11.8.2 Another Approach
    • Exercises
    • References
  • Appendix: Solutions to Starred Exercises
  • Index

Preface

the new Section 8.3.3 on birth and death queueing models. Section 11.8.2 gives a new approach that can be used to simulate the exact stationary distribution of a Markov chain that satisfies a certain property. Among the newly added examples are 1.11, which is concerned with a multiple player gambling problem; 3.20, which finds the variance in the matching rounds problem; 3.30, which deals with the characteristics of a random selection from a population; and 4.25, which deals with the stationary distribution of a Markov chain.

Course

Ideally, this text would be used in a one-year course in probability models. Other possible courses would be a one-semester course in introductory probability theory (involving Chapters 1–3 and parts of others) or a course in elementary stochastic processes. The textbook is designed to be flexible enough to be used in a variety of possible courses. For example, I have used Chapters 5 and 8, with smatterings from Chapters 4 and 6, as the basis of an introductory course in queueing theory.

Examples and Exercises

Many examples are worked out throughout the text, and there are also a large number of exercises to be solved by students. More than 100 of these exercises have been starred and their solutions provided at the end of the text. These starred problems can be used for independent study and test preparation. An Instructor’s Manual, containing solutions to all exercises, is available free to instructors who adopt the book for class.

Organization

Chapters 1 and 2 deal with basic ideas of probability theory. In Chapter 1 an axiomatic framework is presented, while in Chapter 2 the important concept of a random variable is introduced. Subsection 2.6.1 gives a simple derivation of the joint distribution of the sample mean and sample variance of a normal data sample.

Chapter 3 is concerned with the subject matter of conditional probability and conditional expectation. “Conditioning” is one of the key tools of probability theory, and it is stressed throughout the book. When properly used, conditioning often enables us to easily solve problems that at first glance seem quite difficult. The final section of this chapter presents applications to (1) a computer list problem, (2) a random graph, and (3) the Polya urn model and its relation to the Bose–Einstein distribution. Subsection 3.6.5 presents k-record values and the surprising Ignatov’s theorem.


In Chapter 4 we come into contact with our first random, or stochastic, process, known as a Markov chain, which is widely applicable to the study of many real-world phenomena. Applications to genetics and production processes are presented. The concept of time reversibility is introduced and its usefulness illustrated. Subsection 4.5.3 presents an analysis, based on random walk theory, of a probabilistic algorithm for the satisfiability problem. Section 4.6 deals with the mean times spent in transient states by a Markov chain. Section 4.9 introduces Markov chain Monte Carlo methods. In the final section we consider a model for optimally making decisions known as a Markovian decision process.

In Chapter 5 we are concerned with a type of stochastic process known as a counting process. In particular, we study a kind of counting process known as a Poisson process. The intimate relationship between this process and the exponential distribution is discussed. New derivations for the Poisson and nonhomogeneous Poisson processes are discussed. Examples relating to analyzing greedy algorithms, minimizing highway encounters, collecting coupons, and tracking the AIDS virus, as well as material on compound Poisson processes, are included in this chapter. Subsection 5.2.4 gives a simple derivation of the convolution of exponential random variables.

Chapter 6 considers Markov chains in continuous time with an emphasis on birth and death models. Time reversibility is shown to be a useful concept, as it is in the study of discrete-time Markov chains. Section 6.7 presents the computationally important technique of uniformization.

Chapter 7, the renewal theory chapter, is concerned with a type of counting process more general than the Poisson. By making use of renewal reward processes, limiting results are obtained and applied to various fields. Section 7.9 presents new results concerning the distribution of time until a certain pattern occurs when a sequence of independent and identically distributed random variables is observed. In Subsection 7.9.1, we show how renewal theory can be used to derive both the mean and the variance of the length of time until a specified pattern appears, as well as the mean time until one of a finite number of specified patterns appears. In Subsection 7.9.2, we suppose that the random variables are equally likely to take on any of m possible values, and compute an expression for the mean time until a run of m distinct values occurs. In Subsection 7.9.3, we suppose the random variables are continuous and derive an expression for the mean time until a run of m consecutive increasing values occurs.

Chapter 8 deals with queueing, or waiting line, theory. After some preliminaries dealing with basic cost identities and types of limiting probabilities, we consider exponential queueing models and show how such models can be analyzed. Included in the models we study is the important class known as a network of queues. We then study models in which some of the distributions are allowed to be arbitrary. Included are Subsection 8.6.3 dealing with an optimization problem concerning a single server, general service time queue, and Section 8.8, concerned with a single server, general service time queue in which the arrival source is a finite number of potential users.


Jean Lemaire, University of Pennsylvania
Andrew Lim, University of California, Berkeley
George Michailidis, University of Michigan
Donald Minassian, Butler University
Joseph Mitchell, State University of New York, Stony Brook
Krzysztof Osfaszewski, University of Illinois
Erol Pekoz, Boston University
Evgeny Poletsky, Syracuse University
James Propp, University of Massachusetts, Lowell
Anthony Quas, University of Victoria
Charles H. Roumeliotis, Proofreader
David Scollnik, University of Calgary
Mary Shepherd, Northwest Missouri State University
Galen Shorack, University of Washington, Seattle
Marcus Sommereder, Vienna University of Technology
Osnat Stramer, University of Iowa
Gabor Szekeley, Bowling Green State University
Marlin Thomas, Purdue University
Henk Tijms, Vrije University
Zhenyuan Wang, University of Binghamton
Ward Whitt, Columbia University
Bo Xhang, Georgia University of Technology
Julie Zhou, University of Victoria


1.2 Sample Space and Events

Some examples are the following.

  1. If the experiment consists of the flipping of a coin, then

S = { H , T }

where H means that the outcome of the toss is a head and T that it is a tail.

  2. If the experiment consists of rolling a die, then the sample space is

S = {1, 2, 3, 4, 5, 6}

where the outcome i means that i appeared on the die, i = 1, 2, 3, 4, 5, 6.

  3. If the experiment consists of flipping two coins, then the sample space consists of the following four points:

S = {( H , H ), ( H , T ), ( T , H ), ( T , T )}

The outcome will be ( H , H ) if both coins come up heads; it will be ( H , T ) if the first coin comes up heads and the second comes up tails; it will be ( T , H ) if the first comes up tails and the second heads; and it will be ( T , T ) if both coins come up tails.

  4. If the experiment consists of rolling two dice, then the sample space consists of the following 36 points:

S = { (1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
      (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
      (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
      (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
      (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
      (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6) }

where the outcome ( i , j ) is said to occur if i appears on the first die and j on the second die.

  5. If the experiment consists of measuring the lifetime of a car, then the sample space consists of all nonnegative real numbers. That is,

S = [0, ∞)∗
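The finite sample spaces in Examples (1) through (4) map directly onto sets in code. The sketch below is only an illustration and is not part of the text; it is written in Python with hypothetical variable names, builds each finite sample space, and confirms that the two-dice space contains the 36 points listed above.

```python
# Illustrative sketch (not from the text): the finite sample spaces of
# Examples (1)-(4) represented as Python sets of outcomes.
from itertools import product

coin = {"H", "T"}                         # Example (1): flipping a coin
die = {1, 2, 3, 4, 5, 6}                  # Example (2): rolling a die
two_coins = set(product(coin, repeat=2))  # Example (3): ordered pairs such as (H, T)
two_dice = set(product(die, repeat=2))    # Example (4): ordered pairs (i, j)

assert len(two_coins) == 4   # the four points listed in Example (3)
assert len(two_dice) == 36   # the 36 points listed in Example (4)
```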

Any subset E of the sample space S is known as an event. Some examples of events are the following.

1′. In Example (1), if E = { H }, then E is the event that a head appears on the flip of the coin. Similarly, if E = { T }, then E would be the event that a tail appears.

2′. In Example (2), if E = { 1 }, then E is the event that one appears on the roll of the die. If E = {2, 4, 6}, then E would be the event that an even number appears on the roll.

∗ The set ( a , b ) is defined to consist of all points x such that a < x < b. The set [ a , b ] is defined to consist of all points x such that a ≤ x ≤ b. The sets ( a , b ] and [ a , b ) are defined, respectively, to consist of all points x such that a < x ≤ b and all points x such that a ≤ x < b.


3′. In Example (3), if E = {( H , H ), ( H , T )}, then E is the event that a head appears on the first coin.

4′. In Example (4), if E = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}, then E is the event that the sum of the dice equals seven.

5′. In Example (5), if E = (2, 6), then E is the event that the car lasts between two and six years.

We say that the event E occurs when the outcome of the experiment lies in E. For any two events E and F of a sample space S we define the new event E ∪ F to consist of all outcomes that are either in E or in F or in both E and F. That is, the event E ∪ F will occur if either E or F occurs. For example, in (1) if E = { H } and F = { T }, then

E ∪ F = { H , T }

That is, E ∪ F would be the whole sample space S. In (2) if E = {1, 3, 5} and F = {1, 2, 3}, then

E ∪ F = {1, 2, 3, 5}

and thus E ∪ F would occur if the outcome of the die is 1 or 2 or 3 or 5. The event E ∪ F is often referred to as the union of the event E and the event F.

For any two events E and F, we may also define the new event EF, sometimes written E ∩ F, and referred to as the intersection of E and F, as follows. EF consists of all outcomes which are both in E and in F. That is, the event EF will occur only if both E and F occur. For example, in (2) if E = {1, 3, 5} and F = {1, 2, 3}, then

EF = {1, 3}

and thus EF would occur if the outcome of the die is either 1 or 3. In Example (1) if E = { H } and F = { T }, then the event EF would not consist of any outcomes and hence could not occur. To give such an event a name, we shall refer to it as the null event and denote it by Ø. (That is, Ø refers to the event consisting of no outcomes.) If EF = Ø, then E and F are said to be mutually exclusive.

We also define unions and intersections of more than two events in a similar manner. If E_1, E_2, ... are events, then the union of these events, denoted by ⋃_{n=1}^∞ E_n, is defined to be the event that consists of all outcomes that are in E_n for at least one value of n = 1, 2, .... Similarly, the intersection of the events E_n, denoted by ⋂_{n=1}^∞ E_n, is defined to be the event consisting of those outcomes that are in all of the events E_n, n = 1, 2, .... Finally, for any event E we define the new event E^c, referred to as the complement of E, to consist of all outcomes in the sample space S that are not in E. That is, E^c will occur if and only if E does not occur. In Example (4)
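The set operations just introduced (union, intersection, complement, and the null event) correspond one-for-one to Python's built-in set operators. The sketch below is only an illustration and is not part of the text: it re-creates Example (4) and the event of item 4′; the second event F (the first die showing an even number) is a hypothetical event introduced here purely for demonstration.

```python
# Illustrative sketch (not from the text): events as subsets of a sample
# space, with union, intersection, complement, and mutual exclusivity
# expressed through Python set operations.
from itertools import product

S = set(product(range(1, 7), repeat=2))       # Example (4): rolling two dice

E = {(i, j) for (i, j) in S if i + j == 7}    # item 4': the sum of the dice equals seven
F = {(i, j) for (i, j) in S if i % 2 == 0}    # illustrative event: the first die is even

union = E | F            # E ∪ F: outcomes in E, in F, or in both
intersection = E & F     # EF (E ∩ F): outcomes in both E and F
complement = S - E       # E^c: outcomes of S that are not in E

assert len(E) == 6                    # the six points listed in item 4'
assert (E & complement) == set()      # EF = Ø: an event and its complement are mutually exclusive
```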