Special Random Processes

May 26, 2017 | Author: Santhanam Krishnan | Category: Mathematics, Applied Mathematics

Special Random Processes
Author: K. Santhanam, M.Sc., M.Phil.

Abstract
Random processes are used in a variety of fields including economics, finance, engineering, biochemistry, physics and other important areas. In the introductory chapter I give some basic important definitions related to random processes, correlation and Markov processes. In Chapter II I discuss the Bernoulli process and the binomial process and their special properties. In Chapter III I explain the Poisson process and the Gaussian process and their properties. In Chapter IV I describe the birth and death process and the pure birth process, and derive the differential equations of the general birth process. In this connection I take up the Yule-Furry process in Chapter V and derive its probability function and probability generating function, and give some notes about additional special processes based on the birth and death process. Finally I conclude about these processes.

Chapter I
Introduction

1.1. Definition. Random Process
A random process is a collection of random variables indexed by a time parameter, i.e. a random process is a collection of random variables {X(s, t)} where s ∈ S, the sample space, and t is the time (parameter).

Example 1.1.1. Consider the collection of random variables {X_t}, where X_t denotes the outcome of the t-th throw of a die and the parameter t = 1, 2, 3, .... Clearly {X_t} is a random process.

Example 1.1.2. Consider the collection of random variables {X(t)}, where X(t) is the number of incoming telephone calls in the time interval (0, t).

1.2. Definition. Auto Correlation
The autocorrelation of a random process {X(t)} is defined by
R_XX(t_1, t_2) = E[X(t_1) X(t_2)],
where X(t_1) and X(t_2) are any two random variables of the process.

The autocorrelation is also defined as R_XX(t, t + τ) = E[X(t) X(t + τ)]; it is denoted by R_XX(τ).

1.2.1. Joint Auto Correlation.
The joint autocorrelation (cross-correlation) of two random processes {X(t)} and {Y(t)} is defined by
R_XY(t_1, t_2) = E[X(t_1) Y(t_2)] = E[X(t) Y(t + τ)]

1.2.2. Auto correlation for a complex valued process.
The autocorrelation of a complex-valued process X(t) is defined by
R_XX(t_1, t_2) = E[X(t_1) X*(t_2)], where X*(t) is the complex conjugate of X(t).

1.3. Definition. Covariance
If {X(t)} is a random process, then the covariance of X(t) is defined by
C_XX(t_1, t_2) = R_XX(t_1, t_2) − E[X(t_1)] E[X(t_2)]; it is also denoted by C(t_1, t_2).

1.3.1. Definition. Joint Covariance
If {X(t)} and {Y(t)} are two random processes, then the joint covariance of X(t) and Y(t) is defined by
C_XY(t_1, t_2) = R_XY(t_1, t_2) − E[X(t_1)] E[Y(t_2)]

1.4. Definition. Stationary Process (Or) Strong Sense Stationary Process (Or) Strict Sense Stationary Process (Or) Strict Stationary Process (SSS).
A random process {X(t)} is said to be a stationary (or strict/strong sense stationary) process if its statistical characteristics (or properties) do not change with time, i.e. the probability distribution does not change with a time shift. In particular,
E[X(t)] = E[X(t + ε)] and Var[X(t)] = Var[X(t + ε)], for all ε,
i.e. E[X(t)] = constant and Var[X(t)] = constant.

1.5. Definition. Wide Sense Stationary Process (Or) Weak Sense Stationary Process (Or) Covariance Stationary Process (WSS)

A random process {X(t)} is said to be a wide sense stationary (or weak sense stationary, or covariance stationary) process if the following conditions are satisfied:
01. The mean of the random process is constant: E[X(t)] = constant.
02. Its autocorrelation function is a function of the time difference only, i.e.
R_XX(t, t + τ) = E[X(t) X(t + τ)] is a function of τ and not of t (τ is the time difference).

Note 1.5.1. Every strict sense stationary process is wide sense stationary, but the converse is not true.

1.6. Definition. Markov Process.
If in a random process the future value depends only on the most recent past value and is independent of the more distant past values, then the process is called a Markov process.

(i.e.) A random process {X(t)} is said to be Markovian if

P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n, X(t_{n−1}) = x_{n−1}, ..., X(t_1) = x_1] = P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n],

where t_1 ≤ t_2 ≤ ... ≤ t_n ≤ t_{n+1} and x_1, x_2, ..., x_{n+1} are the states of the process at the respective times. Thus, for a Markov process, the future of the process at time t_{n+1} in state x_{n+1} depends only on the process at time t_n in state x_n, and not on the past states.

Example 1.6.1. Consider the probability of rain today, which depends only on the previous day's weather condition and not on earlier days or months. So this is Markovian.
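The defining property above can be illustrated numerically. The sketch below (an added illustration, not part of the original text) simulates a hypothetical two-state weather chain with assumed transition probabilities, and checks that conditioning additionally on yesterday's weather does not change the estimate of tomorrow's rain probability.

```python
import random

random.seed(1)

# Hypothetical two-state weather chain; the transition probabilities below
# are illustrative assumptions, not taken from the text.
P = {"rain": 0.6, "dry": 0.2}  # P(rain tomorrow | today's state)

n = 200_000
states = ["dry"]
for _ in range(n):
    today = states[-1]
    states.append("rain" if random.random() < P[today] else "dry")

def frac_rain_next(today, yesterday=None):
    """Empirical P(rain tomorrow | today [, yesterday])."""
    num = den = 0
    for i in range(1, n):
        if states[i] == today and (yesterday is None or states[i - 1] == yesterday):
            den += 1
            num += states[i + 1] == "rain"
    return num / den

# For a Markov chain, the extra conditioning on yesterday changes nothing.
a = frac_rain_next("rain")
b = frac_rain_next("rain", yesterday="dry")
print(round(a, 2), round(b, 2))  # both close to 0.6
```

Both estimates agree with the one-step transition probability, as the Markov property requires.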

Chapter II
Bernoulli Process and Binomial Process

2.1. Definition. Binomial distribution
For a binomial distribution,
P(X = r) = nCr p^r q^(n−r), r = 0, 1, 2, ..., n,
where X is a random variable counting the number of successes in a random experiment, p is the probability of success, q is the probability of failure (q = 1 − p), and n is the number of trials.

2.2. Definition. Bernoulli Trial.
A random experiment with only two outcomes for a trial, say success with probability p and failure with probability q = 1 − p, is called a Bernoulli trial.

Example 2.2.1. Consider the random experiment of tossing a single fair coin, and take getting a head as success; then the probability of success is p = 1/2 and the probability of failure is q = 1/2.

2.3. Definition. Bernoulli Process
Consider a sequence of independent, identically distributed Bernoulli trials with only two events: success with probability p, and failure with probability q = 1 − p. Let {X_n, n ≥ 1} be the countably infinite sequence of random variables such that

X_n = 1 if the n-th trial is a success, and X_n = 0 if it is a failure,

with probabilities P(X_n = 1) = p and P(X_n = 0) = q. Here {X_n, n ≥ 1} represents the Bernoulli process.

2.4. Statistics of Bernoulli Process

2.4.1. Mean.
Mean = μ = E(X_n) = 1·p + 0·q = p

2.4.2. Variance.
Variance = σ² = Var(X_n) = E[X_n²] − {E[X_n]}² ----- (1)
Now E[X_n²] = 1²·p + 0²·q = p --- (2)
From (1) & (2), Variance = p − p² = p(1 − p) = pq
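The values μ = p and σ² = pq can be checked empirically. A minimal sketch (added here as an illustration; the value p = 0.3 is an arbitrary choice):

```python
import random

random.seed(0)

p = 0.3          # assumed success probability for the check
q = 1 - p
N = 100_000

# Sample N Bernoulli variates and compare the empirical moments with p and pq.
xs = [1 if random.random() < p else 0 for _ in range(N)]

mean = sum(xs) / N
var = sum((x - mean) ** 2 for x in xs) / N

print(round(mean, 3), round(var, 3))   # close to p = 0.3 and pq = 0.21
```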

2.4.3. Auto Correlation.
Auto correlation R(n_1, n_2) = E[X_{n_1} X_{n_2}].
If n_1 ≠ n_2, the variables are independent, so
R(n_1, n_2) = E[X_{n_1}] E[X_{n_2}] = p·p = p².
If n_1 = n_2, then R(n_1, n_2) = E[X_n²] = p (using (2)).

∴ Auto correlation R(n_1, n_2) = { p², if n_1 ≠ n_2 ; p, if n_1 = n_2 }

2.4.4. Auto Covariance
Auto covariance C(n_1, n_2) = R(n_1, n_2) − E[X_{n_1}] E[X_{n_2}].
If n_1 ≠ n_2, C(n_1, n_2) = p² − p² = 0.
If n_1 = n_2, C(n_1, n_2) = p − p² = pq.

∴ Auto covariance C(n_1, n_2) = { 0, if n_1 ≠ n_2 ; pq, if n_1 = n_2 }

2.4. Properties of Bernoulli Process.
2.4.1. Property 1. The Bernoulli process is a discrete random process.

2.4.2. Property 2. The Bernoulli process is a strict sense stationary (SSS) process.
Proof. The variables X_n are independent and identically distributed, so every finite-dimensional distribution of the process is invariant under a shift of the index. In particular the mean E[X_n] = p is a constant and the variance Var(X_n) = pq is a constant.
Hence the Bernoulli process is SSS.

2.5. Binomial Process.
If {X_n, n ≥ 1} is the sequence of random variables in a Bernoulli process, define
S_n = X_1 + X_2 + ... + X_n,
the partial sum of Bernoulli variates. Then {S_n, n ≥ 1} is called the binomial process.

2.6. Statistics of Binomial Process
2.6.1. Mean.
Mean μ = E[S_n] = E[Σ_{i=1}^{n} X_i] = Σ_{i=1}^{n} E[X_i] = np,
since each X_i is Bernoulli and E[X_i] = p.
∴ Mean μ = np

2.6.2. Variance.
Variance = σ² = Var(S_n) = E[S_n²] − {E[S_n]}² ------ (1)

Now E[S_n²] = E[(Σ_{i=1}^{n} X_i)²]
= Σ_{i=1}^{n} E[X_i²] + Σ Σ_{i≠j} E[X_i X_j]
= np + n(n − 1) p², (since E[X_i²] = p and, by independence, E[X_i X_j] = p² for i ≠ j) ---- (2)

From (1) & (2), Variance = np + n(n − 1)p² − n²p² = np − np² = np(1 − p) = npq
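The moments E[S_n] = np and Var(S_n) = npq can be checked by simulation. A small sketch (added illustration; p = 0.4 and n = 20 are arbitrary choices):

```python
import random

random.seed(2)

p, q, n = 0.4, 0.6, 20
reps = 50_000

# Sample S_n = X_1 + ... + X_n many times and compare the empirical mean
# and variance with np and npq.
samples = [sum(1 for _ in range(n) if random.random() < p) for _ in range(reps)]
mean = sum(samples) / reps
var = sum((s - mean) ** 2 for s in samples) / reps

print(round(mean, 2), round(var, 2))   # near np = 8.0 and npq = 4.8
```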

∴ Variance = npq

2.7. Properties of Binomial Process.
2.7.1. Property 1. The binomial process is a Markov process.
Proof.
We know that S_n = X_1 + X_2 + ... + X_n = S_{n−1} + X_n.

If the probability of success is p and the probability of failure is q (= 1 − p), then

P(S_n = m | S_{n−1} = m) = P(X_n = 0) = q
(since the number of successes in n trials equals the number of successes in n − 1 trials, the n-th trial must be a failure, i.e. X_n = 0), and

P(S_n = m | S_{n−1} = m − 1) = P(X_n = 1) = p
(since the number of successes in n trials is one more than the number of successes in n − 1 trials, the n-th trial must be a success, i.e. X_n = 1).

Hence the conditional probability distribution of S_n depends only on the previous value S_{n−1}, and not on the earlier values S_{n−2}, ..., S_1.
∴ It is a Markov process.
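The two transition probabilities can be estimated from a simulated trajectory of the partial sums. A minimal sketch (added illustration; p = 0.4 is an arbitrary choice):

```python
import random

random.seed(3)

p = 0.4
steps = 200_000

# Walk the binomial process S_1, S_2, ... and tally how often it moves up
# by one versus staying put at each step.
s_prev, up, same = 0, 0, 0
for _ in range(steps):
    s_next = s_prev + (1 if random.random() < p else 0)
    if s_next == s_prev + 1:
        up += 1
    else:
        same += 1
    s_prev = s_next

# P(S_n = m + 1 | S_{n-1} = m) ≈ p and P(S_n = m | S_{n-1} = m) ≈ q
print(round(up / steps, 3), round(same / steps, 3))
```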

2.7.2. Property 2. The binomial process is not a stationary process.
Note. The mean μ = E[S_n] = np and the variance Var(S_n) = npq both grow with n, so the distribution of S_n changes with the index n. Hence the binomial process is not stationary; it is, however, a process with stationary independent increments.

Chapter III
Poisson Process and Gaussian Process

3.1. Definition. Poisson distribution.
The Poisson distribution is a discrete frequency distribution which gives the probability of a number of independent events occurring in a fixed time. If X is a discrete random variable and λ is the rate of occurrence of an event in an interval, then the probability of exactly r occurrences is given by

P(X = r) = e^{−λt} (λt)^r / r!, r = 0, 1, 2, ...,

which is the probability law or probability mass function of a Poisson distribution. Here λt is called the parameter of the Poisson distribution.

Examples 3.1.1.
Number of typing errors on a page of a book.
Number of car accidents in a place in a year.
Number of failures of a machine in one month.

3.2. Definition. Poisson Process
The discrete random process {X(t)} defined on the time interval (0, t) is said to be a Poisson process with rate λ if it satisfies the following postulates with respect to a small time interval of length Δt:

(i) P(1 occurrence in (t, t + Δt)) = λΔt + o(Δt)
(ii) P(0 occurrences in (t, t + Δt)) = 1 − λΔt + o(Δt)
(iii) P(2 or more occurrences in (t, t + Δt)) = o(Δt)

Also X(t) is independent of the number of occurrences of the event in any interval prior to and after (0, t). The probability that the event occurs a specified number of times in (t_0, t_0 + t) depends only on t and not on t_0.

3.2.1. Postulate (i)
P(1 occurrence in (t, t + Δt)) = λΔt + o(Δt),
where o(Δt) is a function which is negligible compared to Δt, i.e. o(Δt)/Δt → 0 as Δt → 0.

Proof.
We know that, for a Poisson distribution over an interval of length Δt,
P(X = r) = e^{−λΔt} (λΔt)^r / r!, r = 0, 1, 2, ...

Now P(1 occurrence in (t, t + Δt)) = e^{−λΔt} (λΔt)
= λΔt [1 − λΔt + (λΔt)²/2! − ...], (by using Taylor's series)
= λΔt − λ²(Δt)² + ...

Since Δt is arbitrarily small, the terms in (Δt)² and higher powers of Δt are negligible compared to Δt.
∴ P(1 occurrence in (t, t + Δt)) = λΔt + o(Δt)

3.2.2. Postulate (ii)
P(0 occurrences in (t, t + Δt)) = 1 − λΔt + o(Δt)

Proof.
P(0 occurrences in (t, t + Δt)) = e^{−λΔt} (λΔt)⁰/0! = e^{−λΔt}
= 1 − λΔt + (λΔt)²/2! − ..., (by Taylor's series)
= 1 − λΔt + o(Δt),
where o(Δt) collects the terms in (Δt)² and higher powers.

3.2.3. Postulate (iii)
P(2 or more occurrences in (t, t + Δt)) = o(Δt)

Proof.
P(2 or more occurrences in (t, t + Δt))
= 1 − [P(0 occurrences in (t, t + Δt)) + P(1 occurrence in (t, t + Δt))]
= 1 − [1 − λΔt + o(Δt) + λΔt + o(Δt)]
= o(Δt).

Note 3.2.1. If the rate of occurrence λ is a constant, the Poisson process is called a homogeneous Poisson process; if λ is a function of time t (λ(t)), the process is non-homogeneous. We always consider a Poisson process to be homogeneous unless otherwise specified.

3.2.5. The probability law of the Poisson process
The probability law (function) of a Poisson process is given by
P(X(t) = r) = e^{−λt} (λt)^r / r!, r = 0, 1, 2, ..., where λt is the parameter.

3.3. The Second order Probability Function of a Homogeneous Poisson Process.
P(X(t_1) = n_1, X(t_2) = n_2) = P(X(t_1) = n_1) P(X(t_2) = n_2 | X(t_1) = n_1)
= P(X(t_1) = n_1) P(X(t_2) − X(t_1) = n_2 − n_1), for t_2 > t_1

By the independent-increments property,
P(X(t_1) = n_1, X(t_2) = n_2) = [e^{−λt_1}(λt_1)^{n_1}/n_1!] · [e^{−λ(t_2−t_1)} {λ(t_2 − t_1)}^{n_2−n_1}/(n_2 − n_1)!]

∴ P(X(t_1) = n_1, X(t_2) = n_2)
= { e^{−λt_2} λ^{n_2} t_1^{n_1} (t_2 − t_1)^{n_2−n_1} / [n_1! (n_2 − n_1)!], if t_2 > t_1 and n_2 ≥ n_1 ; 0, otherwise }

3.3.1. The Third order Probability Function of a Homogeneous Poisson Process
P(X(t_1) = n_1, X(t_2) = n_2, X(t_3) = n_3)
= { e^{−λt_3} λ^{n_3} t_1^{n_1} (t_2 − t_1)^{n_2−n_1} (t_3 − t_2)^{n_3−n_2} / [n_1! (n_2 − n_1)! (n_3 − n_2)!], if t_3 > t_2 > t_1 and n_3 ≥ n_2 ≥ n_1 ; 0, otherwise }

3.4. Statistics of Poisson Process
3.4.1. Mean of Poisson Process
We know that P(X(t) = r) = e^{−λt}(λt)^r/r!, r = 0, 1, 2, ...

Mean μ = E[X(t)] = Σ_{r=0}^{∞} r e^{−λt}(λt)^r/r!
= λt e^{−λt} Σ_{r=1}^{∞} (λt)^{r−1}/(r − 1)!
= λt e^{−λt} e^{λt}
∴ Mean μ = λt

3.4.2. Variance of Poisson Process.
Variance = Var(X(t)) = E[X²(t)] − {E[X(t)]}² ----- (1)
We know that E[X(t)] = λt -------- (2)

Now E[X²(t)] = Σ_{r=0}^{∞} r² e^{−λt}(λt)^r/r!
= Σ_{r=0}^{∞} [r(r − 1) + r] e^{−λt}(λt)^r/r!
= (λt)² e^{−λt} Σ_{r=2}^{∞} (λt)^{r−2}/(r − 2)! + λt
= (λt)² e^{−λt} e^{λt} + λt
= (λt)² + λt ---------- (3)

From (1), (2) & (3), Variance = Var(X(t)) = (λt)² + λt − (λt)² = λt
∴ Variance = λt
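The identity Mean = Variance = λt can be checked numerically. The sketch below (an added illustration; λ = 2 and t = 3 are arbitrary choices) builds the counting process from exponential inter-arrival gaps, anticipating Property 3.6.4:

```python
import random

random.seed(4)

lam, t, reps = 2.0, 3.0, 40_000

def poisson_count(rate, horizon):
    """Count arrivals in [0, horizon] by accumulating exponential gaps."""
    total, n = random.expovariate(rate), 0
    while total <= horizon:
        n += 1
        total += random.expovariate(rate)
    return n

counts = [poisson_count(lam, t) for _ in range(reps)]
mean = sum(counts) / reps
var = sum((c - mean) ** 2 for c in counts) / reps

print(round(mean, 2), round(var, 2))   # both near λt = 6.0
```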

3.4.3. Autocorrelation of Poisson Process.
Auto correlation R(t_1, t_2) = E[X(t_1) X(t_2)]. Take t_2 ≥ t_1 and write X(t_2) = X(t_1) + [X(t_2) − X(t_1)]:

R(t_1, t_2) = E[ X(t_1){X(t_2) − X(t_1)} + X²(t_1) ]
= E[X(t_1)] E[X(t_2) − X(t_1)] + E[X²(t_1)], (by independent increments)
= λt_1 · λ(t_2 − t_1) + λt_1 + λ²t_1²
= λ²t_1 t_2 + λt_1, if t_2 ≥ t_1

Similarly R(t_1, t_2) = λ²t_1 t_2 + λt_2 if t_1 ≥ t_2.
∴ Auto correlation R(t_1, t_2) = λ²t_1 t_2 + λ min(t_1, t_2)

3.4.4. Covariance of Poisson Process.
We know that covariance C(t_1, t_2) = R(t_1, t_2) − E[X(t_1)] E[X(t_2)]
= λ²t_1 t_2 + λ min(t_1, t_2) − (λt_1)(λt_2)
∴ Covariance C(t_1, t_2) = λ min(t_1, t_2)

3.4.5. Correlation Coefficient of Poisson Process.
Correlation coefficient ρ(t_1, t_2) = C(t_1, t_2) / [√Var[X(t_1)] √Var[X(t_2)]] = λ min(t_1, t_2) / √(λt_1 · λt_2)

If t_1 < t_2, ρ(t_1, t_2) = λt_1 / (λ√(t_1 t_2)) = √(t_1/t_2)
If t_2 < t_1, ρ(t_1, t_2) = √(t_2/t_1)



3.5. Definition. Inter arrival time The time interval between two successive occurrences of a process is called inter arrival time. Example: 3.5.1 The time between two consecutive arrivals of customers in a queue. The time between two consecutive arrivals of train in a station. The time between two consecutive successes in a random experiment. Note: 3.5.1 The Cumulative distribution (Cdf) of Poisson is defined by

( )

(

Note: 3.5.2 The Probability distribution function (Pdf) or mass function is defined by

)

Santhanam 21

( )

( )

.

Note: 3.5.3 The probability distribution function of exponential distribution with parameter λ is given ( )

by

Mean = and Variance =

3.6. Properties of Poisson Process
3.6.1. Property 1. The Poisson process is a Markov process.
Proof. Consider t_1 < t_2 < t_3. Then

P(X(t_3) = n_3 | X(t_2) = n_2, X(t_1) = n_1)
= P(X(t_1) = n_1, X(t_2) = n_2, X(t_3) = n_3) / P(X(t_1) = n_1, X(t_2) = n_2)

Using the second- and third-order probability functions of the homogeneous Poisson process (Sections 3.3 and 3.3.1),

= [e^{−λt_3} λ^{n_3} t_1^{n_1} (t_2 − t_1)^{n_2−n_1} (t_3 − t_2)^{n_3−n_2} / (n_1!(n_2 − n_1)!(n_3 − n_2)!)] ÷ [e^{−λt_2} λ^{n_2} t_1^{n_1} (t_2 − t_1)^{n_2−n_1} / (n_1!(n_2 − n_1)!)]

= e^{−λ(t_3−t_2)} [λ(t_3 − t_2)]^{n_3−n_2} / (n_3 − n_2)! ------- (1)

But P(X(t_3) = n_3 | X(t_2) = n_2) = P(X(t_3) − X(t_2) = n_3 − n_2)
= e^{−λ(t_3−t_2)} [λ(t_3 − t_2)]^{n_3−n_2} / (n_3 − n_2)! --------- (2)

From (1) & (2),
P(X(t_3) = n_3 | X(t_2) = n_2, X(t_1) = n_1) = P(X(t_3) = n_3 | X(t_2) = n_2)

The conditional distribution of X(t_3) given all past values X(t_1), X(t_2) depends only on the latest past value X(t_2). Hence the Poisson process is a Markov process.

3.6.2. Property 2. (Additive) The sum of two independent Poisson processes is a Poisson process.

Proof. Let {X_1(t)} and {X_2(t)} be two independent Poisson processes with parameters λ_1 and λ_2 respectively. Let X(t) = X_1(t) + X_2(t).

Now Mean = E[X(t)] = E[X_1(t) + X_2(t)] = E[X_1(t)] + E[X_2(t)] = λ_1 t + λ_2 t = (λ_1 + λ_2) t

Since X_1(t) and X_2(t) are independent,
Var{X(t)} = Var{X_1(t)} + Var{X_2(t)} = λ_1 t + λ_2 t = (λ_1 + λ_2) t

Hence for {X(t)}, Mean = Variance = (λ_1 + λ_2) t.
We know that for a Poisson process the mean equals the variance.
Hence {X(t)} is also a Poisson process, with parameter (λ_1 + λ_2) t.

Proof. Let {X_1(t)} and {X_2(t)} be two independent Poisson processes with parameters λ_1 and λ_2 respectively. Let X(t) = X_1(t) − X_2(t).

Mean = E[X(t)] = E[X_1(t)] − E[X_2(t)] = (λ_1 − λ_2) t

Since X_1(t) and X_2(t) are independent,
Var{X(t)} = Var{X_1(t)} + Var{X_2(t)} = (λ_1 + λ_2) t

We know that for a Poisson process the mean equals the variance. Here
Mean = (λ_1 − λ_2) t ≠ (λ_1 + λ_2) t = Variance.
Hence {X(t)} is not a Poisson process.

3.6.4. Property 4. The inter-arrival time of a Poisson process with parameter λ has an exponential distribution with mean 1/λ.

Proof. Let E_i and E_{i+1} be two successive occurrences of a Poisson process with mean occurrence rate (parameter) λ. Let E_i take place at time t_i, and let T be the inter-arrival time, i.e. the time interval between the occurrences of E_i and E_{i+1}. Clearly T is a continuous random variable.

P(T > t) = P(E_{i+1} did not occur in (t_i, t_i + t))
= P(no occurrence in an interval of length t)
= P(X(t) = 0) = e^{−λt} ---------- (1)

The cumulative distribution function of T is
F(t) = P(T ≤ t) = 1 − P(T > t) = 1 − e^{−λt}, (using (1))

Hence the probability density function is
f(t) = dF(t)/dt = λ e^{−λt}, t ≥ 0,

which is the pdf of an exponential distribution.
∴ The inter-arrival time of a Poisson process is exponentially distributed with parameter λ and mean 1/λ.
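This property can be checked by building the process directly from postulate (i): divide time into tiny slots, placing one occurrence per slot with probability ≈ λΔt, and then measuring the gaps between successive occurrences. A sketch (an added illustration; λ = 2 and the discretization are arbitrary choices):

```python
import math
import random

random.seed(5)

lam, dt = 2.0, 0.001
n_slots = 2_000_000   # simulates 2000 time units

# One Bernoulli trial per slot of length dt (postulate (i)); collect the
# gaps between successive occurrences.
p = lam * dt
gaps, last = [], None
for i in range(n_slots):
    if random.random() < p:
        if last is not None:
            gaps.append((i - last) * dt)
        last = i

mean_gap = sum(gaps) / len(gaps)
tail = sum(g > 1.0 for g in gaps) / len(gaps)

print(round(mean_gap, 2))                        # near 1/λ = 0.5
print(round(tail, 3), round(math.exp(-2.0), 3))  # empirical P(T > 1) near e^{-2}
```

The empirical mean gap approaches 1/λ and the tail P(T > 1) approaches e^{−λ}, as the exponential law predicts.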

3.6.5. Property 5. If the number of occurrences of an event E in an interval of length t is a Poisson process {X(t)} with parameter λ, and if each occurrence of E has a constant probability p of being recorded, the recordings being independent of each other, then the number N(t) of recorded occurrences in t is also a Poisson process, with parameter λp.

Proof.
P(N(t) = n) = P(number of occurrences of E in time t recorded = n)
= P(E occurs n times in t and all n of them are recorded)
or P(E occurs n + 1 times in t and n of them are recorded)
or P(E occurs n + 2 times in t and n of them are recorded)
or ... P(E occurs n + r times in t and n of them are recorded) or ...

= Σ_{r=0}^{∞} P(X(t) = n + r) · C(n + r, n) p^n q^r, (since the recordings are independent, the number recorded out of n + r occurrences is binomial; here q = 1 − p)

= Σ_{r=0}^{∞} [e^{−λt}(λt)^{n+r}/(n + r)!] · [(n + r)!/(n! r!)] p^n q^r

= [e^{−λt}(λtp)^n/n!] Σ_{r=0}^{∞} (λtq)^r/r!

= [e^{−λt}(λtp)^n/n!] e^{λtq}

= e^{−λt(1−q)} (λpt)^n/n!

∴ P(N(t) = n) = e^{−λpt}(λpt)^n/n!, (since 1 − q = p)

N(t) is a Poisson process with parameter λp.
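The thinning result can be verified by simulation: generate Poisson counts, record each occurrence independently with probability p, and check that the recorded counts have mean and variance near λpt. A sketch (added illustration; the parameter values are arbitrary choices):

```python
import random

random.seed(6)

lam, p, t, reps = 5.0, 0.3, 2.0, 30_000

def poisson_count(rate, horizon):
    """Arrivals in [0, horizon] from exponential inter-arrival gaps."""
    total, n = random.expovariate(rate), 0
    while total <= horizon:
        n += 1
        total += random.expovariate(rate)
    return n

# Thin each occurrence independently with recording probability p.
recorded = []
for _ in range(reps):
    x = poisson_count(lam, t)
    recorded.append(sum(1 for _ in range(x) if random.random() < p))

mean = sum(recorded) / reps
var = sum((r - mean) ** 2 for r in recorded) / reps
print(round(mean, 2), round(var, 2))   # both near λpt = 3.0
```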

Problem 3.6.1. Suppose that customers arrive at a bank according to a Poisson process with a mean rate of 3 per minute; find the probability that during a time interval of 2 min (i) exactly 4 customers arrive and (ii) more than 4 customers arrive.

Solution. Given mean arrival rate λ = 3 per minute and time t = 2 minutes, so λt = 6.
We know that P(X(t) = r) = e^{−λt}(λt)^r/r!

(i) P(X(2) = 4) = e^{−6} 6⁴/4! = 54 e^{−6} ≈ 0.1339

(ii) P(X(2) > 4) = 1 − P(X(2) ≤ 4)
= 1 − e^{−6}[1 + 6 + 6²/2! + 6³/3! + 6⁴/4!]
= 1 − e^{−6}[1 + 6 + 18 + 36 + 54]
= 1 − 115 e^{−6} ≈ 0.7149
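The two answers can be reproduced directly (a quick numerical check added here):

```python
import math

lam, t = 3.0, 2.0
a = lam * t  # λt = 6

def pois(r):
    """Poisson pmf with parameter a = λt."""
    return math.exp(-a) * a**r / math.factorial(r)

p_exactly_4 = pois(4)
p_more_than_4 = 1 - sum(pois(r) for r in range(5))

print(round(p_exactly_4, 4))    # 0.1339
print(round(p_more_than_4, 4))  # 0.7149
```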

Problem 3.6.2. If customers arrive at a counter in accordance with a Poisson process with a mean rate of 2 per minute, find the probability that the interval between two consecutive arrivals is (i) more than 1 min, (ii) between 1 min and 2 min, and (iii) 4 min or less.

Solution. Given λ = 2 per minute. Here we have to find probabilities for the interval between two consecutive arrivals, i.e. the inter-arrival time T, which is a continuous random variable. By Property 3.6.4, the inter-arrival time has an exponential distribution with pdf f(t) = λ e^{−λt} = 2 e^{−2t}, t ≥ 0.

(i) Probability of more than 1 min = P(T > 1) = ∫_1^∞ 2 e^{−2t} dt = e^{−2} ≈ 0.1353

(ii) Probability between 1 min and 2 min = P(1 < T < 2) = ∫_1^2 2 e^{−2t} dt = e^{−2} − e^{−4} ≈ 0.1170

(iii) Probability of 4 min or less = P(T ≤ 4) = ∫_0^4 2 e^{−2t} dt = 1 − e^{−8} ≈ 0.9997
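The three exponential probabilities in Problem 3.6.2 can be evaluated directly (a quick numerical check added here):

```python
import math

lam = 2.0  # arrivals per minute; inter-arrival time T ~ Exp(λ)

p_more_1 = math.exp(-lam * 1)                       # P(T > 1)
p_1_to_2 = math.exp(-lam * 1) - math.exp(-lam * 2)  # P(1 < T < 2)
p_le_4 = 1 - math.exp(-lam * 4)                     # P(T ≤ 4)

print(round(p_more_1, 4), round(p_1_to_2, 4), round(p_le_4, 4))
```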

Problem 3.6.3. Queries presented to a computer database follow a Poisson process of rate λ = 6 queries per minute. An experiment consists of monitoring the database for m minutes and recording N(m), the number of queries presented. (i) What is the probability of no queries in a one-minute interval? (ii) What is the probability of exactly 6 queries arriving in a one-minute interval? (iii) What is the probability of fewer than 3 queries arriving in a half-minute interval?

Solution. Given N(m) = number of queries presented in m minutes and λ = 6 queries per minute.
Here P(N(t) = r) = e^{−λt}(λt)^r/r! = e^{−6t}(6t)^r/r!

(i) Probability of no queries in a one-minute interval: here t = 1, r = 0:
P(N(1) = 0) = e^{−6} ≈ 0.0025

(ii) Probability of exactly 6 queries in a one-minute interval:
P(N(1) = 6) = e^{−6} 6⁶/6! = 64.8 e^{−6} ≈ 0.1606

(iii) Probability of fewer than 3 queries in a half-minute interval: here λt = 6 × (1/2) = 3:
P(N(1/2) < 3) = P(N(1/2) = 0) + P(N(1/2) = 1) + P(N(1/2) = 2)
= e^{−3}[1 + 3 + 3²/2!] = 8.5 e^{−3} ≈ 0.4232

3.7. Correlation Coefficient
For a Poisson process, the correlation coefficient of X(t) and X(t + s), s ≥ 0, is

ρ(X(t), X(t + s)) = C(X(t), X(t + s)) / [√Var(X(t)) √Var(X(t + s))]
= λt / √(λt · λ(t + s)) = √(t/(t + s))

If s = 0, then ρ(X(t), X(t + s)) = 1.

3.8. Definition. Normal (Or) Gaussian Process
If X(t_1), X(t_2), ..., X(t_n) are jointly normal for every n and every set of instants t_1, t_2, ..., t_n, then the real-valued random process {X(t)} is called a normal (or) Gaussian process. A Gaussian process is completely specified by its first- and second-order moments, i.e. the mean and the (co)variance.

3.8.1. The first order density function of a Gaussian process
Let μ = E(X(t)) and σ² = Var(X(t)) = E[(X(t) − μ)²]. For a single instant the covariance matrix reduces to Λ = [σ²], with |Λ| = σ² and cofactor 1.

∴ The pdf is
f(x; t) = 1/(σ√(2π)) exp{−(x − μ)²/(2σ²)}

3.8.2. The second order density function of a Gaussian process
Let μ_i = E[X(t_i)] and σ_i² = Var[X(t_i)] for i = 1, 2, and let
r = C(X(t_1), X(t_2)) / (σ_1 σ_2)
be the correlation coefficient of X(t_1) and X(t_2). The covariance matrix is

Λ = [ σ_1²        r σ_1 σ_2 ]
    [ r σ_1 σ_2   σ_2²      ]

Now |Λ| = σ_1² σ_2² (1 − r²), and the cofactors are
cofactor of λ_11 = σ_2², cofactor of λ_12 = −r σ_1 σ_2,
cofactor of λ_21 = −r σ_1 σ_2, cofactor of λ_22 = σ_1².

The pdf is given by

f(x_1, x_2; t_1, t_2) = [1/(2π σ_1 σ_2 √(1 − r²))] exp{ −1/(2(1 − r²)) [ ((x_1 − μ_1)/σ_1)² − 2r ((x_1 − μ_1)/σ_1)((x_2 − μ_2)/σ_2) + ((x_2 − μ_2)/σ_2)² ] }

3.8.3. The nth order density function of a Gaussian process
Let μ_i = E[X(t_i)] and let Λ = (λ_ij) be the n × n covariance matrix with entries λ_ij = C(X(t_i), X(t_j)); let |Λ|_ij denote the cofactor of λ_ij in |Λ|. The nth order density function of the Gaussian process is given by

f(x_1, ..., x_n; t_1, ..., t_n) = [1/((2π)^{n/2} √|Λ|)] exp{ −1/(2|Λ|) Σ_{i=1}^{n} Σ_{j=1}^{n} |Λ|_ij (x_i − μ_i)(x_j − μ_j) }

3.9. Properties of Gaussian Process
3.9.1. Property 1. If a Gaussian process is WSS, then it is SSS.
Proof. We know that the nth order density function of a Gaussian process is given by

f(x_1, ..., x_n; t_1, ..., t_n) = [1/((2π)^{n/2} √|Λ|)] exp{ −1/(2|Λ|) Σ_i Σ_j |Λ|_ij (x_i − μ_i)(x_j − μ_j) },

where μ_i = E[X(t_i)] and Λ = (λ_ij) with λ_ij = C(X(t_i), X(t_j)).

Given that the Gaussian process is WSS, the means μ_i are constant and C(X(t_i), X(t_j)) depends only on the time difference t_j − t_i. The time differences are unchanged by a common shift ε, so the nth order density function f(x_1, ..., x_n; t_1, ..., t_n) is the same as the nth order density function f(x_1, ..., x_n; t_1 + ε, ..., t_n + ε).

Hence it is SSS.

3.9.2. Property 2. If the member functions of a Gaussian process are uncorrelated, then they are independent.

Proof. Consider n member variables X(t_1), X(t_2), ..., X(t_n) of the Gaussian process {X(t)}.
If all are uncorrelated, then C(X(t_i), X(t_j)) = 0 for i ≠ j, and C(X(t_i), X(t_i)) = σ_i².

Hence Λ is a diagonal matrix with the variances σ_1², ..., σ_n² on the principal diagonal, so
|Λ| = σ_1² σ_2² ... σ_n², |Λ|_ii = |Λ|/σ_i², and |Λ|_ij = 0 for i ≠ j.

The nth order density function of the Gaussian process then becomes

f(x_1, ..., x_n; t_1, ..., t_n) = [1/((2π)^{n/2} σ_1 σ_2 ... σ_n)] exp{ −Σ_{i=1}^{n} (x_i − μ_i)²/(2σ_i²) }
= Π_{i=1}^{n} [1/(σ_i √(2π))] exp{ −(x_i − μ_i)²/(2σ_i²) }

∴ The nth order density function = product of the first order density functions of X(t_1), X(t_2), ..., X(t_n) respectively.
Hence X(t_1), X(t_2), ..., X(t_n) are independent.
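In the bivariate case this factorization can be checked numerically: with correlation r = 0, the second order density of Section 3.8.2 equals the product of the two first order normal densities. A sketch (added illustration; the means, variances and test point are arbitrary choices):

```python
import math

mu1, mu2, s1, s2, r = 1.0, -2.0, 2.0, 0.5, 0.0

def phi(x, mu, s):
    """First order (univariate) normal density."""
    return math.exp(-(x - mu) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def phi2(x1, x2):
    """Second order (bivariate) normal density with correlation r."""
    q = ((x1 - mu1) / s1) ** 2 \
        - 2 * r * ((x1 - mu1) / s1) * ((x2 - mu2) / s2) \
        + ((x2 - mu2) / s2) ** 2
    return math.exp(-q / (2 * (1 - r * r))) / (2 * math.pi * s1 * s2 * math.sqrt(1 - r * r))

x1, x2 = 0.7, -1.4
diff = abs(phi2(x1, x2) - phi(x1, mu1, s1) * phi(x2, mu2, s2))
print(diff < 1e-12)  # the joint density factorizes when r = 0
```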

Problem 3.9.1. If {X(t)} is a Gaussian process with μ(t) = 10 and C(t_1, t_2) = 16 e^{−|t_1−t_2|}, find the probability that (i) X(10) ≤ 8, (ii) |X(10) − X(6)| ≤ 4.

Solution. Given {X(t)} is a Gaussian process, each X(t) is a normal random variable with mean μ(t) = 10 for any t and variance C(t, t) = 16.

(i) In particular, when t = 10, X(10) is a normal random variable with mean 10 and variance 16, so standard deviation 4.

We know that for a normal variable Z = (X − μ)/σ, and when X(10) = 8, Z = (8 − 10)/4 = −0.5.

∴ P(X(10) ≤ 8) = P(Z ≤ −0.5) = 0.5 − P(0 ≤ Z ≤ 0.5)
= 0.5 − 0.1915 (from the normal table)
= 0.3085

(ii) Now X(10) − X(6) is also a normal random variable with mean μ(10) − μ(6) = 10 − 10 = 0 and variance

Var[X(10) − X(6)] = Var[X(10)] + Var[X(6)] − 2 C(10, 6)
= 16 + 16 − 2 · 16 e^{−|10−6|} = 32(1 − e^{−4}) = 31.4139,

so the standard deviation is √31.4139 = 5.6048.

Here Z = {[X(10) − X(6)] − 0}/5.6048. When X(10) − X(6) = 4, Z = 4/5.6048 = 0.7137.

∴ P(|X(10) − X(6)| ≤ 4) = P(|Z| ≤ 0.7137) = 2 P(0 ≤ Z ≤ 0.7137)
= 2(0.2611) = 0.5222 (from the normal table)
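The normal-table lookups in Problem 3.9.1 can be reproduced with the error function (a check added here; Φ denotes the standard normal cdf):

```python
import math

def Phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# (i) X(10) ~ N(10, 16), so σ = 4 and P(X(10) ≤ 8) = Φ((8 − 10)/4)
p1 = Phi(-0.5)
print(round(p1, 4))     # 0.3085

# (ii) Var[X(10) − X(6)] = 32(1 − e^{-4})
sigma = math.sqrt(32 * (1 - math.exp(-4)))
p2 = 2 * Phi(4 / sigma) - 1
print(round(sigma, 4))  # 5.6048
print(round(p2, 3))     # about 0.525 (the text's 0.5222 reflects table rounding)
```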

Chapter IV
Birth and Death Process and Pure Birth Process

4.1. Birth and Death process
The birth and death process is a special case of a continuous-time Markov process, where the states represent the current size of a population and the transitions are limited to births and deaths. When a birth occurs, the process goes from state i to state i + 1. Similarly, when a death occurs, the process goes from state i to state i − 1. It is assumed that the birth and death events are independent of each other. The birth and death process is characterized by the birth rates {λ_i}, i = 0, 1, ..., and death rates {μ_i}, i = 0, 1, ..., which vary according to the state i of the system.

(i.e.) A birth and death process refers to a Markov process with
- a discrete state space,
- states which can be enumerated with index i = 0, 1, 2, ..., such that
- state transitions can occur only between neighbouring states, i → i + 1 or i → i − 1.

Transition rates:
For a birth, from state i to i + 1: λ_i
For a death, from state i to i − 1: μ_i
For no change in the state i: 0

The probability of a birth in the interval (t, t + h) is λ_i h + o(h).
The probability of a death in the interval (t, t + h) is μ_i h + o(h).

The general description of the birth and death process is as follows: after the process enters state i, it holds (sojourns) in the given state for a random length of time, exponentially distributed with parameter (λ_i + μ_i). When leaving i, the process enters either i + 1, with probability λ_i/(λ_i + μ_i), or i − 1, with probability μ_i/(λ_i + μ_i).

Example 4.1.1. Communication
Suppose that calls arrive at a single-channel telephone exchange such that the successive call inter-arrival times are independent exponential random variables. A connection is realized if the incoming call finds an idle channel; if the channel is busy, the incoming call joins the queue. When the caller is through, the next caller is connected. Assuming that the successive service times are independent exponential variables, the number of callers in the system at time t is described by a birth and death process.

Example 4.1.2. Biological field
The theory of birth and death processes provides a natural mathematical framework for modeling a variety of biological processes. Examples of these biological processes include population dynamics, such as the spreading of infectious diseases, and the somatic evolution of cancers, among others.

4.2. Pure Birth Process
A pure birth process is a birth and death process with μ_i = 0 for all i.
A pure birth process is a continuous-time, discrete-state Markov process. Let {X(t)} be a random process whose possible values X(t) are non-negative integers; X(t) represents the population size at time t and the transitions are limited to births. When a birth occurs, the process goes from state i to state i + 1. If no birth occurs, the process remains at the current state i. The process cannot move from a higher state to a lower state, since there is no death. The birth process is characterized by the birth rate λ_i, which varies according to the state i of the system.

Example 4.2.1. Radioactivity
Radioactive atoms are unstable and disintegrate stochastically. Each of the new atoms is also unstable. By the emission of radioactive particles these new atoms pass through a number of physical states with specified decay rates from one state to the adjacent one. Thus radioactive transformation can be modeled as a birth process.

Example 4.2.2. Industry
Suppose that a number of automatic machines are serviced by one operator. Owing to random failures, the machines may break down and call for service. Assume that the machines work independently, that the operator is busy whenever there is a machine in the waiting line, and that the service times are identically distributed, independent random variables with a known distribution function. Such a case can also be modeled as a birth process.

Example 4.2.3. The spread of new infections of a disease, where each new infection is counted as a birth.

4.2.1. Derivation of the differential equations of the general pure birth process.
The differential equations of the birth process are given by

P_n'(t) = −λ_n P_n(t) + λ_{n−1} P_{n−1}(t), n ≥ 1
P_0'(t) = −λ_0 P_0(t)

Derivation.
Let X(t) be the population at time t, and let P_n(t) = P(X(t) = n).
Claim: to derive the differential equations satisfied by P_n(t).

Assume the following postulates for the birth event when X(t) = n:
1. P(1 birth/occurrence in (t, t + h)) = λ_n h + o(h)
2. P(0 births/occurrences in (t, t + h)) = 1 − λ_n h + o(h)
3. P(2 or more births/occurrences in (t, t + h)) = o(h),
where o(h)/h → 0 as h → 0 and λ_n is the rate of birth occurrence when the population size is n.

If X(t) = n and no birth occurs in the interval (t, t + h), then X(t + h) = n.
If X(t) = n − 1 and one birth occurs in the interval (t, t + h), then X(t + h) = n. ----------- (1)

Hence, for n ≥ 1,
P_n(t + h) = P(X(t + h) = n)
= P(X(t) = n) P(X(t + h) = n | X(t) = n) + P(X(t) = n − 1) P(X(t + h) = n | X(t) = n − 1) + o(h), (by using (1))
= P_n(t)[1 − λ_n h] + P_{n−1}(t) λ_{n−1} h + o(h), (using postulates 1, 2)

P_n(t + h) − P_n(t) = −λ_n h P_n(t) + λ_{n−1} h P_{n−1}(t) + o(h)

Divide both sides by h:
[P_n(t + h) − P_n(t)]/h = −λ_n P_n(t) + λ_{n−1} P_{n−1}(t) + o(h)/h

Taking the limit as h → 0 on both sides, since o(h)/h → 0,

P_n'(t) = −λ_n P_n(t) + λ_{n−1} P_{n−1}(t), n ≥ 1 -------- (2)

If n = 0, then X(t + h) = 0 only when X(t) = 0 and there is no birth in (t, t + h) ------- (3)

P_0(t + h) = P(X(t + h) = 0) = P(X(t) = 0) P(X(t + h) = 0 | X(t) = 0), (by using (3))
= P_0(t)[1 − λ_0 h] + o(h)

P_0(t + h) − P_0(t) = −λ_0 h P_0(t) + o(h)

Divide both sides by h and take the limit as h → 0:

P_0'(t) = −λ_0 P_0(t) ------- (4)

Equations (2) and (4) are the differential equations of the birth process.

Note 4.1.1. Geometric distribution.
The probability mass function of the geometric distribution is given by
P(X = r) = q^{r−1} p, r = 1, 2, 3, ...,
where p is the probability of success and r is the number of trials up to and including the first success.
Mean = 1/p and Variance = q/p².

Chapter V
Yule-Furry Process and some special Birth and Death processes

5.1. Yule-Furry Process.
This process is a special case of the pure birth process. When λ_n = nλ, with initial conditions P_1(0) = 1 and P_n(0) = 0 for n ≥ 2, the process is called the Yule-Furry process, i.e. λ_n = nλ and the process starts with only one member at time t = 0.

5.1.1. Derivation of P_n(t) and the probability generating function of the Yule-Furry process

We know that the differential equations of the pure birth process are given by
P_n'(t) = −λ_n P_n(t) + λ_{n−1} P_{n−1}(t), n ≥ 1 ------- (1)
P_0'(t) = −λ_0 P_0(t) ------- (2)

When λ_n = nλ, (1) and (2) become
P_n'(t) = −nλ P_n(t) + (n − 1)λ P_{n−1}(t) ------ (3)
P_0'(t) = 0, so P_0(t) = P_0(0) = 0 ------ (4)

We are going to find P_n(t) by the method of mathematical induction.

For n = 1, (3) gives
P_1'(t) = −λ P_1(t), i.e. P_1'(t)/P_1(t) = −λ

Integrating both sides with respect to t,
log P_1(t) = −λt + c_1, where c_1 is the integration constant,
P_1(t) = c e^{−λt}, where c = e^{c_1} ----- (5)

Using the initial condition P_1(0) = 1 at t = 0, c = 1.
∴ (5) gives P_1(t) = e^{−λt} ------- (6)

For n = 2, (3) gives
P_2'(t) = −2λ P_2(t) + λ P_1(t)
P_2'(t) + 2λ P_2(t) = λ e^{−λt}, (by using (6))

Multiply both sides by the integrating factor e^{2λt}:
d/dt [P_2(t) e^{2λt}] = λ e^{λt}

Integrating both sides with respect to t,
P_2(t) e^{2λt} = e^{λt} + c_2, where c_2 is the integration constant -------- (7)

Using the initial condition P_2(0) = 0 at t = 0: 0 = 1 + c_2, i.e. c_2 = −1.
∴ (7) gives P_2(t) = e^{−2λt}(e^{λt} − 1) = e^{−λt}(1 − e^{−λt})

By mathematical induction, let us assume
P_{n−1}(t) = e^{−λt}(1 − e^{−λt})^{n−2} ------ (8)

From (3),
P_n'(t) + nλ P_n(t) = (n − 1)λ P_{n−1}(t) = (n − 1)λ e^{−λt}(1 − e^{−λt})^{n−2} ----- (9)

Multiplying both sides of (9) by the integrating factor e^{nλt}:
d/dt [P_n(t) e^{nλt}] = (n − 1)λ e^{(n−1)λt}(1 − e^{−λt})^{n−2} = (n − 1)λ e^{λt}(e^{λt} − 1)^{n−2}

Integrating both sides with respect to t (note that d/dt (e^{λt} − 1)^{n−1} = (n − 1)λ e^{λt}(e^{λt} − 1)^{n−2}),
P_n(t) e^{nλt} = (e^{λt} − 1)^{n−1} + c_n, where c_n is the integration constant ---- (10)

Using the initial condition P_n(0) = 0 at t = 0: 0 = 0 + c_n, i.e. c_n = 0.

∴ (10) gives P_n(t) = e^{−nλt}(e^{λt} − 1)^{n−1} = e^{−λt}(1 − e^{−λt})^{n−1}, n ≥ 1

Clearly P_n(t) is a geometric distribution with parameter e^{−λt}.

The probability generating function is given by
P(s, t) = Σ_{n≥1} s^n P_n(t) = Σ_{n≥1} s^n e^{−λt}(1 − e^{−λt})^{n−1} = s e^{−λt} Σ_{m≥0} [s(1 − e^{−λt})]^m

∴ P(s, t) = s e^{−λt} / [1 − s(1 − e^{−λt})], if |s(1 − e^{−λt})| < 1

Note 5.1.1. Depending upon the values of λ_n and μ_n of the birth and death process, various types of birth and death processes can be defined.
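The derived law P_n(t) = e^{−λt}(1 − e^{−λt})^{n−1} can be checked by direct simulation of the Yule-Furry process, using the fact that in state n the holding time before the next birth is exponential with rate nλ. A sketch (added illustration; λ = 1 and t = 0.7 are arbitrary choices):

```python
import math
import random

random.seed(7)

lam, t, reps = 1.0, 0.7, 50_000

def yule_size(rate, horizon):
    """Population at time `horizon`: from state n, wait Exp(n·λ) for the next birth."""
    n, clock = 1, 0.0
    while True:
        clock += random.expovariate(n * rate)
        if clock > horizon:
            return n
        n += 1

counts = {}
for _ in range(reps):
    k = yule_size(lam, t)
    counts[k] = counts.get(k, 0) + 1

# Compare with P_n(t) = e^{-λt}(1 - e^{-λt})^{n-1}, geometric with parameter e^{-λt}
p = math.exp(-lam * t)
for n in range(1, 5):
    print(n, round(counts.get(n, 0) / reps, 3), round(p * (1 - p) ** (n - 1), 3))
```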

5.2. Pure Death Process
A pure death process is a birth and death process with λ_i = 0 for all i.

5.3. Immigration process
When λ_n = λ in the birth and death process, i.e. the birth rate is independent of the population size n, the increase in the population can be regarded as due to an external source. This process is an immigration process.

5.4. Emigration Process
When μ_n = μ in the birth and death process, i.e. the death rate is independent of the population size n, the decrease in the population can be regarded as due to the elimination of some elements present in the population. This process is an emigration process.

5.5. Linear birth Process
When λ_n = nλ in the birth and death process, λ_n h is the conditional probability of one birth in an interval of length h, given that n organisms are present at the beginning of the interval; λ is the birth rate in a unit interval per organism. The process is known as a linear birth process.

5.6. Linear death Process
When μ_n = nμ in the birth and death process, the process is known as a linear death process.

Chapter VI
Conclusion
In many real-life situations, observations are made over a period of time and are influenced by random effects, not just at a single instant but throughout the entire interval of time or sequence of times. Random processes play a vital role in all fields, including digital communication systems. Based on the essential aspects of random processes, we can conclude that they play an important part in the latest developments in almost every field of activity. I am continuing to work in depth on these topics to provide further valuable ideas.

