Splitting a Poisson Process: M/G/Infinity Queue

1. Notation

  • M: “Markovian” or “Memoryless” arrival process (i.e., Poisson Process)
  • G: General service time (not necessarily exponential)
  • infty: Infinite number of servers
Let 
  • X(t) be the number of customers who have completed service by time t
  • Y(t) be the number of customers who are being served at time t
  • N(t) be the total number of customers who have arrived by time t

2. Splitting the arrival process

  • Fix a reference time T.
  • Consider the process of customers arriving prior to time T. 
  • A customer arriving at time t leq T is
    • Type I: if service is completed before time T
      • occurs with probability P_I(t) = G(T-t)
    • Type II: if the customer is still in service at time T
      • occurs with probability P_{II}(t) = G^c(T-t)
Since arrival times and service times are all independent, the type assignments are independent. Therefore,
  • X(T) is a Poisson random variable with mean lambda int^T_0 P_I(t) dt = lambda int^T_0 G(T-t) dt = lambda int^T_0 G(t)dt.
  • Y(T) is a Poisson random variable with mean lambda int^T_0 P_{II}(t)dt = lambda int^T_0 G^c(T-t)dt = lambda int^T_0 G^c(t)dt
  • X(T) and Y(T) are independent
What happens as T to infty?
  • G(t) approx 1 for large t, so X(T) is approximately a Poisson random variable with mean lambda T (almost all arrivals have completed service)
  • Y(T) is a Poisson random variable with mean lambda int^T_0 G^c(t)dt to lambda int^{infty}_0 G^c(t)dt = lambda E[G], where E[G] is the mean service time
Summary: Number of customers in service in an M/G/infty queue, in steady state, is a Poisson random variable with mean lambda E[G].
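This steady-state claim is easy to sanity-check by simulation. The sketch below uses a hypothetical setup (lambda = 3, service times UNIF[0,4], so lambda E[G] = 3 * 2 = 6) and estimates E[Y(T)] for a T much larger than any service time:

```python
import random

def avg_in_service(lam, service_sampler, T, trials=2000):
    """Estimate E[Y(T)], the mean number of customers still in service
    at time T, by simulating an M/G/infinity queue `trials` times."""
    total = 0
    for _ in range(trials):
        t = random.expovariate(lam)          # first Poisson arrival
        while t <= T:
            if t + service_sampler() > T:    # Type II: still in service at T
                total += 1
            t += random.expovariate(lam)     # next arrival
    return total / trials

random.seed(0)
# Hypothetical parameters: lambda = 3, service ~ UNIF[0, 4], so E[G] = 2
est = avg_in_service(3.0, lambda: random.uniform(0.0, 4.0), T=50.0)
# est should be close to lambda * E[G] = 6
```

Because T = 50 exceeds the longest possible service time here, int^T_0 G^c(t)dt already equals E[G] exactly, so the estimate targets lambda E[G] with no truncation error.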

3. Example

Compound Poisson Process (CPP)

1. Definition: 

Remove the restriction that two or more customers cannot arrive at the same time (i.e., remove the orderliness property).

Let N(t) be a Poisson process with rate lambda, and let Y_1, Y_2, cdots be i.i.d. random variables independent of N(t). Then X(t) = sum^{N(t)}_{i=1} Y_i is a compound Poisson process.

Example #1: Buses arrive according to a Poisson process. Let Y_i be the number of people on bus i, and let X(t) be the total number of people arriving by time t.

Example #2: Insurance claims arrive according to a Poisson process. Let Y_i be the size of the claim (in dollars), and let X(t) be the total amount due to all claims by time t.

2. Expectation: 

E[X(t)] = E[E[X(t)|N(t)]]
Since E[X(t)|N(t) = n] = E[sum^n_{i=1} Y_i] = nE[Y_i], we have E[X(t)|N(t)] = N(t)E[Y_i].
So E[X(t)] = E[N(t)E[Y_i]] = E[N(t)] E[Y_i] = lambda t E[Y_i]

3. Variance: 

Var[X(t)] = Var[E[X(t)|N(t)]] + E[Var[X(t)|N(t)]]
and we have Var[X(t)|N(t)=n] = Var(sum^n_{i=1} Y_i) = nVar[Y_i],
i.e., Var[X(t)|N(t)] = N(t)Var[Y_i].
So Var[X(t)] = Var[N(t)E[Y_i]] + E[N(t)Var[Y_i]]
  = lambda t E^2[Y_i] + lambda t Var[Y_i]
  = lambda t E^2[Y_i] + lambda t (E[Y_i^2] - E^2[Y_i])
  = lambda t E[Y_i^2]
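Both moment formulas can be checked by simulation. A minimal sketch under hypothetical choices (lambda = 2, t = 5, jumps Y_i ~ UNIF[0,3], so E[Y] = 1.5 and E[Y^2] = 3):

```python
import random

def compound_poisson(lam, t, jump_sampler):
    """One sample of X(t) = sum_{i=1}^{N(t)} Y_i, with N(t) built from
    exp(lam) interarrival times and Y_i drawn from jump_sampler."""
    x, s = 0.0, random.expovariate(lam)
    while s <= t:
        x += jump_sampler()
        s += random.expovariate(lam)
    return x

random.seed(1)
lam, t = 2.0, 5.0
samples = [compound_poisson(lam, t, lambda: random.uniform(0.0, 3.0))
           for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# Theory: E[X(t)] = lam * t * E[Y] = 15 and Var[X(t)] = lam * t * E[Y^2] = 30
```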

4. Example: 



Non-Homogeneous Poisson Process (NHPP)

1. Properties:

  • N(0) = 0
  • N(t) has independent increments
  • Pr{N(t+h) - N(t) = 1} = lambda(t)h + o(h)
  • Pr{N(t+h)-N(t) geq 2} = o(h)
Notes:
  • This is like a Poisson process, but without the stationary increments assumption
A process with the above properties is an NHPP with intensity (or rate) function lambda(t)

2. Definition: 

The mean value function (for a NHPP) is

m(t) = int^t_0 lambda(u) du

3. Key Properties: 

  • For an NHPP, N(s+t) - N(s) is a Poisson random variable with mean m(s+t) - m(s)
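One standard way to simulate an NHPP (and check the key property) is thinning: generate a homogeneous Poisson process at a rate rate_max geq lambda(t) and keep each point at time t with probability lambda(t)/rate_max. A sketch with a hypothetical intensity lambda(t) = 1 + t on [0, 2], for which m(2) = int^2_0 (1+u) du = 4:

```python
import random

def nhpp_count(rate_fn, rate_max, T):
    """Number of NHPP events in [0, T], simulated by thinning a
    homogeneous Poisson process of rate rate_max: an arrival at time t
    is kept with probability rate_fn(t) / rate_max."""
    n, t = 0, random.expovariate(rate_max)
    while t <= T:
        if random.random() < rate_fn(t) / rate_max:
            n += 1
        t += random.expovariate(rate_max)
    return n

random.seed(2)
# Hypothetical intensity lambda(t) = 1 + t on [0, 2]; rate_max = 3 bounds it
counts = [nhpp_count(lambda u: 1.0 + u, 3.0, 2.0) for _ in range(20000)]
avg = sum(counts) / len(counts)
# avg should be close to m(2) = 4
```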

The Poisson Process

1. Preliminary Definitions

Def: A stochastic process is a collection of random variables (RVs) indexed by time, {X(t), t in T}.

  • If T is a continuous set, the process is a continuous-time stochastic process (e.g., Poisson process)
  • If T is countable, the process is a discrete-time stochastic process (e.g., Markov chain)
Def: A counting process is a stochastic process {N(t); t geq 0} such that
  • N(t) in {0, 1, 2, cdots} (that is, N(t) is a non-negative integer)
  • If s < t, then N(s) leq N(t) (that is, N(t) is non-decreasing in t)
  • For s < t, N(t) - N(s) is the number of events occurring in the time interval (s,t].

Def: A counting process has stationary increments if the distribution of the number of events in an interval depends on the length of the interval, but not on its starting point. That is, P(N(s+t) - N(s) = n) does not depend on s. Intuitively, the interval can be “slid” around without changing its stochastic nature.

2. Definition of Poisson Process

Definition 1: A Poisson process is a counting process {N(t); t geq 0} with rate lambda >0, if:

  • N(0) = 0;
  • The Process has independent increments
  • The number of events in any interval of length t is a Poisson RV with mean lambda t.
That is, for all s,t geq 0 and n = 0,1,2, cdots
P(N(s+t) - N(s) = n) = e^{-lambda t} (lambda t)^{n}/n!.

Definition 2: A Poisson process is a counting process {N(t); t geq 0} with rate lambda >0 if

  1. N(0) = 0
  2. The process has stationary increments
  3. The process has independent increments
  4. Pr{N(h) =1} = lambda h + o(h) (# of events approximately proportional to the length of interval)
  5. Pr{N(h) geq 2} = o(h) (can’t have 2 or more events at the same time — “orderliness”)
Eliminating individual assumptions yields variations on the Poisson process:
  • Eliminate assumption 2: non-homogeneous Poisson process
  • Eliminate assumption 3: mixture of Poisson processes (choose lambda randomly, then run a Poisson process)
  • Eliminate assumption 5: compound Poisson process
Definition 3: A Poisson process with rate lambda is a counting process whose times between events are i.i.d. exp(lambda) random variables.
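Definitions 1 and 3 can be cross-checked numerically: build N(t) from i.i.d. exp(lambda) interevent times (Definition 3) and compare against the Poisson probabilities of Definition 1. A sketch with hypothetical lambda = 1.5 and t = 2:

```python
import math
import random

def count_events(lam, t):
    """N(t) for a counting process with i.i.d. exp(lam) interevent
    times (Definition 3)."""
    n, s = 0, random.expovariate(lam)
    while s <= t:
        n += 1
        s += random.expovariate(lam)
    return n

random.seed(3)
lam, t = 1.5, 2.0
samples = [count_events(lam, t) for _ in range(30000)]
p0 = sum(1 for n in samples if n == 0) / len(samples)
# Definition 1 predicts P(N(t) = 0) = e^{-lam*t} = e^{-3}
```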

3. Conditional Distribution of Event Times


4. Example

The Poisson Distribution

1. Definition of Poisson Random Variable

Def: A Poisson random variable X with mean A has probability mass function P(X=i) = e^{-A} frac{A^i}{i!}, i = 0, 1, 2, cdots

  • mean: A
  • variance: A

2. Binomial Approximation to the Poisson

Let X_1, cdots, X_N be indicator variables (each 0 or 1), and assume Pr(X_i = 1) = A/N for all i. Let X = sum^N_{i=1} X_i; then E[X] = A.
Based on the assumptions, we have a binomial distribution:
P(sum^N_{i=1} X_i = k) = binom{N}{k}(A/N)^k(1-A/N)^{N-k}
                       = frac{N!}{(N-k)!k!} (A/N)^k (1-A/N)^{N-k}
                       to e^{-A} A^k/k! as N to infty (a Poisson probability)
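The convergence of the binomial probability to the Poisson probability as N grows can be seen numerically. A short check (A = 2 and k = 3 are hypothetical choices):

```python
import math

def binom_pmf(N, k, p):
    """Binomial(N, p) probability of exactly k successes."""
    return math.comb(N, k) * p**k * (1.0 - p)**(N - k)

A, k = 2.0, 3
poisson = math.exp(-A) * A**k / math.factorial(k)
approx = [binom_pmf(N, k, A / N) for N in (10, 100, 10000)]
# binom_pmf(N, k, A/N) approaches the Poisson probability as N grows
```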

3. Example

Suppose 400 people are gathered, and let X_i indicate whether person i was born on a given day, so P(X_i = 1) = 1/365.
Then X = sum^{400}_{i=1} X_i is approximately Poisson with mean A = 400/365.
So P(X geq 2) = 1 - P(X < 2) = 1 - e^{-A} - Ae^{-A}
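Plugging in the numbers (a quick evaluation of the formula above):

```python
import math

A = 400 / 365                        # mean of the approximating Poisson
p = 1.0 - math.exp(-A) - A * math.exp(-A)
# p = P(X >= 2), which comes out to about 0.30
```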


Exponential Distribution and Properties

– Stochastic Process course notes

1. Definition

  • Probability density function (PDF): f(x) = lambda e^{-lambda x}, x geq 0
  • Cumulative Distribution Function (CDF): F(x) = 1-e^{-lambda x}, x geq 0
  • Complement of the CDF (CCDF): F^c(x) = e^{-lambda x}, x geq 0.

2. Memoryless Property

Def 1: A random variable X has the memoryless property if Pr{X>t+s | X>s} = Pr{X>t}

Def 2: A random variable X has the memoryless property if Pr{X>t+s} = Pr{X>t} Pr{X>s}

The exponential distribution is the only continuous distribution that has the memoryless property (it satisfies definition 2).
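Definition 2 holds exactly for the exponential CCDF, since e^{-lambda(t+s)} = e^{-lambda t} e^{-lambda s}. A one-line numerical check (rate and times chosen arbitrarily):

```python
import math

lam = 0.7                      # arbitrary rate

def ccdf(x):
    """CCDF of an exp(lam) random variable: Pr{X > x}."""
    return math.exp(-lam * x)

s, t = 1.2, 2.5                # arbitrary nonnegative times
lhs = ccdf(t + s)              # Pr{X > t + s}
rhs = ccdf(t) * ccdf(s)        # Pr{X > t} * Pr{X > s}
# lhs and rhs agree up to floating-point rounding
```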

3. Useful Properties: First occurrence among events

Assume X_1, X_2, cdots, X_n are independent exponential random variables with rates lambda_1, lambda_2, cdots, lambda_n.
Then what is the probability that X_1 < X_2? For n = 2, Pr{X_1 < X_2} = frac{lambda_1}{lambda_1 + lambda_2}.

More generally

Pr{X_i = min[X_1,cdots, X_n]} = frac{lambda_i}{lambda_1+lambda_2+cdots +lambda_n}


4. Distribution of time of first event

Pr{min(X_1, X_2) > x} = Pr{X_1 > x} Pr{X_2 > x} = e^{-(lambda_1 + lambda_2)x}
This is the CCDF of an exponential RV with rate lambda_1 + lambda_2, therefore
min(X_1, X_2) ~ exp(lambda_1 + lambda_2)
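Both facts (the probability that X_1 is the minimum, and the distribution of the minimum) can be verified by simulation. A sketch with hypothetical rates lambda_1 = 1 and lambda_2 = 3:

```python
import random

random.seed(4)
l1, l2 = 1.0, 3.0                    # hypothetical rates
n = 50000
first_wins = 0                       # times X_1 occurs first
min_total = 0.0                      # running sum of min(X_1, X_2)
for _ in range(n):
    x1 = random.expovariate(l1)
    x2 = random.expovariate(l2)
    if x1 < x2:
        first_wins += 1
    min_total += min(x1, x2)
# Theory: Pr{X_1 < X_2} = l1/(l1+l2) = 0.25 and E[min] = 1/(l1+l2) = 0.25
```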

5. Distribution of time of last event (maximum)

Excess Distribution of Renewal Process


– Course notes of Stochastic Process, 2014 Fall

1. Definition

The excess of a renewal process is defined as Y(t) = S_{N(t)+1} - t (the time until the next event).
In the bus-waiting example, we derived
lim_{T to infty} frac{int^T_0 Y(u)du}{T} = frac{E[X_n]}{2} + frac{Var[X_n]}{2E[X_n]}

Now we are going to derive Pr(Y(t) > x) for a random t.
Interpretation: You show up “at random”. What is the probability that you wait more than x for the next event?

2. Derivation of Pr(Y(t) > x)

We want to determine the fraction of time that Y(t) > x.
Let I(t) = 1 if Y(t) > x, and let I(t) = 0 otherwise.
Interpretation: Fraction of time that Y(t) > x = Fraction of “on” time for I(t)
Let Z_j be “on” time during cycle j, then Z_j = max(X_j - x, 0)
  • Z_j = X_j -x if X_j > x
  • Z_j = 0 otherwise

Note: the ON time and OFF time in each cycle are dependent, since both are functions of the same X_j.
Then we have
E[Z_j] = E[max(X_j - x, 0)] = int^{infty}_0 Pr(max(X_j - x, 0) > u) du
                            = int^{infty}_0 Pr{X_j - x > u} du
                            = int^{infty}_0 Pr{X_j > x + u} du
                            = int^{infty}_x Pr{X_j > u} du
Since I(t) is an alternating renewal process, the fraction of “on” time is
frac{E[Z_j]}{E[X_j]} = frac{1}{E[X_j]} int^{infty}_x F^c(u) du
This is sometimes called the equilibrium distribution.

3. Examples with different distributions of X_j

Example: X_j ~ exp(lambda)
Pr(Y(t) > x) = frac{1}{E[X_j]} int^{infty}_{x} F^c(u) du = lambda int^{infty}_x e^{-lambda u} du
             = e^{-lambda x}
As expected (by memoryless property), excess distribution is an exponential distribution.
Example: Pareto Distribution 
Pr(X_j > x) = (1+x)^{-alpha}
alpha is also called the tail index
Some properties: E[X_j] = frac{1}{alpha - 1} (the mean only exists if alpha > 1)
Pr(Y(t) > x) = frac{1}{E[X_j]} int^{infty}_x F^c(u) du = (1+x)^{-(alpha-1)}
Example: Deterministic X_j = D
– Here F^c(u) = 1 if u < D, and F^c(u) = 0 otherwise, with E[X_j] = D.
Pr(Y(t) > x) = frac{1}{E[X_j]} int^{infty}_x F^c(u) du
             = frac{1}{D} int^{D}_{x} du
             = 1 - frac{x}{D}
This is the CCDF of a uniform distribution on [0, D].
Note: the above assumes x leq D; if x > D, the result is 0.
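The equilibrium-distribution formula can be tested by simulation: run a renewal process, show up at a random time, and record the wait until the next event. The sketch below uses a hypothetical interarrival distribution X_j ~ UNIF[0,2] (so E[X_j] = 1), for which the formula gives Pr(Y > x) = int^2_x (1 - u/2) du = (1 - x/2)^2:

```python
import random

def excess_at_random_time(interarrival, horizon):
    """Y(t) = S_{N(t)+1} - t, the time until the next renewal,
    observed at a uniformly random time t in [0, horizon]."""
    t = random.uniform(0.0, horizon)
    s = 0.0
    while s <= t:
        s += interarrival()
    return s - t                      # first renewal after t, minus t

random.seed(5)
# Hypothetical interarrival distribution: X_j ~ UNIF[0, 2], E[X_j] = 1
samples = [excess_at_random_time(lambda: random.uniform(0.0, 2.0), 200.0)
           for _ in range(20000)]
p = sum(1 for y in samples if y > 0.5) / len(samples)
# Equilibrium CCDF at x = 0.5: (1 - 0.25)^2 = 0.5625
```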

Alternating Renewal Process


1. Definition

Consider a process X(t) with “on” periods and “off” periods.
  • Let Z_j equiv ON time of cycle j
  • Let Y_j equiv OFF time of cycle j
Then Z_j and Y_j must satisfy the following property in order for X(t) to be regenerative:
The pairs (Z_j, Y_j) must be i.i.d.; in particular, (Z_j, Y_j) is independent of (Z_i, Y_i) for i neq j. That is,
  • the Z_j are i.i.d.
  • the Y_j are i.i.d.
  • However, Z_j and Y_j may be dependent for the same j
Cycle time X_j = Z_j + Y_j.
Then by the regenerative process, we have
the long-run fraction of “up” time = frac{mbox{average up time in a cycle}}{mbox{average time of one cycle}} = frac{E[Z_j]}{E[Z_j] + E[Y_j]}

2. Example

Cars pass a point on a highway according to a Poisson process with rate lambda = 2/min. 1/5 of cars are speeding (>10 mph over the posted speed limit). Assume the time to issue a ticket is ~ UNIF[10,14] minutes (one officer).
Question: What fraction of speeding cars pass while the officer is busy?
Answer: This is equivalent to the fraction of time the officer is busy.
  • Let Y_j = time spent waiting to give a ticket
  • Let Z_j = time spent giving a ticket
Speeders arrive according to Poisson process with rate 2 cdot 1/5 = 2/5 per min.
Fraction of time busy = frac{E[Z_j]}{E[Z_j] + E[Y_j]} = frac{12}{12+5/2} = frac{12}{14.5}
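The renewal-reward answer can be confirmed by simulating cycles directly; by memorylessness, the wait for the next speeder after finishing a ticket is exp(2/5). A sketch:

```python
import random

def busy_fraction(speeder_rate, ticket_sampler, cycles=20000):
    """Long-run fraction of time busy for an alternating renewal
    process: each cycle is an exp(speeder_rate) wait (OFF) plus a
    ticket-writing time (ON)."""
    busy = total = 0.0
    for _ in range(cycles):
        off = random.expovariate(speeder_rate)   # wait for next speeder
        on = ticket_sampler()                    # time to issue ticket
        busy += on
        total += off + on
    return busy / total

random.seed(6)
frac = busy_fraction(0.4, lambda: random.uniform(10.0, 14.0))
# Theory: E[Z] = 12, E[Y] = 1/0.4 = 2.5, so frac should be near 12/14.5
```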

[Stochastic Process] Black-Scholes Formula

1. Introduction

Motivation: Find the cost of buying options that prevents arbitrage opportunities.
Definition: Let X(s) be the price of a stock at time s, considered on a time horizon s in [0,t]. The following actions are available:
  • At any time s, 0 leq s leq t, you can buy or sell shares of stock for X(s)
  • At time s = 0, there are N options available. The cost of option i is c_i per option, which allows you to purchase 1 share at time t_i for price k_i.
Objective: determine c_i so that no arbitrage opportunity exists.
Approach: By the arbitrage theorem, this requires finding a probability measure under which each bet has zero expected payoff.

We try the following probability measure on X(t): suppose that X(t) is geometric Brownian motion. That is, X(t) = x_0 e^{Y(t)}, where Y(t) = sigma B(t) + mu t.

2. Analyze Bet 0: Purchase 1 share of stock at time s

Now, we consider a discount factor alpha. Then, the expected present value of the stock at time t is
E[e^{-alpha t} X(t) | X(s)] = e^{-alpha t} X(s) e^{mu(t-s) + frac{sigma^2(t-s)}{2}}

Choose mu, sigma such that alpha = mu + sigma^2/2

Then E[e^{-alpha t} X(t) | X(s)] = e^{-alpha t} X(s) e^{alpha(t-s)} = e^{-alpha s} X(s).

Thus the expected present value of the stock at t equals the present value of the stock when it is purchased at time s. That is, the discount rate exactly equals the expected rate of return of the stock.


3. Analyze Bet i: Purchase 1 share of option i (at time s =0)

First, we drop the subscript i to simplify notation.

Expected present value of the return on this bet is

-c + E[e^{-alpha t} max(X(t) - k, 0)]
= -c + E[e^{-alpha t} (X(t) - k)^+]

Setting this equal to zero implies that
ce^{alpha t} = E[(X(t) - k)^+] = E[(x_0 e^{Y(t)} - k)^+]
             = int^{infty}_{-infty} (x_0 e^y - k)^+ frac{1}{sqrt{2 pi sigma^2 t}} e^{-frac{(y - mu t)^2}{2 sigma^2 t}} dy

(x_0 e^y - k)^+ has value 0 when x_0 e^y - k leq 0, i.e., when y leq ln(k/x_0).

Thus the integral becomes
ce^{alpha t} = int^{infty}_{ln(k/x_0)} (x_0 e^y - k) frac{1}{sqrt{2 pi sigma^2 t}} e^{-frac{(y - mu t)^2}{2 sigma^2 t}} dy

Now, apply a change of variables:
w = frac{y - mu t}{sigma sqrt{t}}, i.e., y = sigma sqrt{t} w + mu t
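Carrying out this change of variables (and splitting the integral into its two terms) yields the classical Black-Scholes call price, with alpha playing the role of the interest rate. A sketch of the resulting formula, using hypothetical parameter values and math.erf for the normal CDF:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def option_cost(x0, k, t, alpha, sigma):
    """No-arbitrage cost c of an option to buy one share at time t for
    price k, when X(t) = x0 * e^{Y(t)} is geometric Brownian motion
    with alpha = mu + sigma^2/2 (Black-Scholes call price, rate alpha)."""
    if t == 0:
        return max(x0 - k, 0.0)
    d1 = (math.log(x0 / k) + (alpha + sigma**2 / 2.0) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return x0 * norm_cdf(d1) - k * math.exp(-alpha * t) * norm_cdf(d2)

c = option_cost(x0=100.0, k=100.0, t=1.0, alpha=0.05, sigma=0.2)
# c is about 10.45 for these (hypothetical) parameters
```

This form makes the sanity checks in the summary below easy to probe numerically, e.g., c falls as the strike k rises and grows with the volatility sigma.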

4. Summary

No arbitrage exists if we find costs and a probability distribution such that the expected outcome of every bet is 0.
  • If we suppose that the stock price follows geometric Brownian motion and we choose the option costs according to the above formula, then the expected outcome of every bet is 0.
  • Note: the stock price does not actually need to follow geometric Brownian motion. We are saying that if the stock price followed geometric Brownian motion, then the expected outcome of every bet would be 0, so no arbitrage would exist.
A few sanity checks
  • As t to infty, the cost of the option approaches x_0
  • When t = 0, the cost of the option is (x_0 - k)^+
  • As t increases, c increases
  • As k increases, c decreases
  • As x_0 increases, c increases
  • As sigma increases, c increases
  • As alpha increases, c increases