On a Continuous-Time Martingale and Two Applications

We construct a continuous-time martingale to analyze two queueing systems. The first is the GI^X/M/1 queue with light-tailed batch arrivals, for which we obtain the first exact result in closed form when X has a Geometric distribution, and stochastic bounds otherwise. The second application concerns queues with Semi-Markovian (SM) inter-arrival times; of particular interest is the scenario with multiplexed SM sources whose aggregate loses the SM property. Unlike existing exact but implicit solutions, which rely on numerical transform methods even in the M_r/M/1 case, our stochastic bounds are in closed form and shown to be (mostly) numerically accurate.


INTRODUCTION
Martingale-based methods can address many challenging queues in terms of closed-form stochastic bounds; these are numerically accurate, especially in heavy traffic, and significantly improve upon alternative bounds obtained using large-deviations or network-calculus approaches. Starting with the GI/G/1 analysis by Kingman [16], in terms of exponential bounds, recent studies addressed diverse scenarios with Markovian arrival processes [4,7,9,13,14,21], various scheduling algorithms [23], or multiple-access protocols [24].
In this paper we construct a continuous-time martingale which is structurally similar to existing martingales but is formulated in terms of the counting processes corresponding to arrivals and services, as opposed to the compound arrival processes used in [9]. This simple twist enables our main result in queues with batch arrivals, which are a natural abstraction in a variety of scenarios including sensor networks, cloud computing, or even cognitive radio networks [25]. Concretely, we provide the first exact result in closed form for the GI^X/M/1 queue, whereby the arrivals follow a renewal process but occur in batches/bulks of a (Geometric) random size X. In the more general case, when X has a light-tailed distribution, our results are expressed as stochastic bounds. We note that the classical GI^X/M/1 analysis rests on transform methods; these are numerically challenging even in the M^X/M/1 case with a deterministic batch size (see § 3).
The second application is in queues with Semi-Markovian arrival processes (SMPs), which jointly generalize renewal and Markov processes: the inter-arrival times not only can have general distributions but can also be correlated. Studying these models is motivated both by measurements (e.g., [5,10,19,30,33]) and by analytical studies demonstrating a wide variability of the queue size as a function of the autocorrelation. For a specific queue with Markov-dependent inter-arrivals and exponential service times, the queue size can grow by as much as a factor of 9 at high utilizations, depending on the lag-1 autocorrelation (i.e., the correlation coefficient between two consecutive inter-arrivals) [31]; the baseline (renewal) case corresponds to zero autocorrelation. Significantly larger differences have been reported in [22], for a queue with Markov-renewal inter-arrivals (the modulating Markov chain having two states only) and exponential service times: at high utilizations (around 0.9), the queue size can grow by a factor of 100 when increasing the lag-1 autocorrelation from 0.1 to 0.7. Other extraordinary queueing effects of injecting autocorrelation into the inter-arrival times were reported in [18].
There are several computational/algorithmic methods to exactly analyze SM queues. One of the earliest works used transform methods [6]. Wiener-Hopf factorization was considered in [11]; explicit factorizations are available in special cases, subject to some technical conditions (see also [12]). An iterative procedure was proposed in [29]. Transform methods are used in [1,3] for an SM queue additionally subject to correlation between arrivals and services (a single Markov chain modulates both the arrivals and the service times); a related, more general model is analyzed in [2] using matrix-analytic methods. Earlier applications of matrix-analytic methods were considered in the classical text [20], p. 159. As pointed out in [11], some of these methods are insufficiently understood from a numerical/computational point of view.
Using the newly constructed martingale, the SM/M/1 analysis proceeds similarly to that of the GI^X/M/1 queue. While an identical analysis could be obtained using a (discrete-time) martingale, for suitable random embedding points (see § 4.1), the key advantage of the proposed (continuous-time) martingale is that it also applies to the ΣSM/M/1 queue, involving a superposition (the 'Σ') of multiple SM arrival processes. The key challenge arising in these systems is that the superposed arrival process is not stationary, let alone SM. The same idea has recently been used for a superposition queue in [9], but with a different (continuous-time) martingale, which does not however lend itself to the GI^X/M/1 exact solutions herein.
In the following, we first introduce the model and present two key lemmas for the construction of continuous-time martingales.
Next we analyze the GI^X/M/1 queue, and then the SM/M/1 and ΣSM/M/1 queues, along with some numerical comparisons of the bounds against simulations. Some conclusions and an Appendix including some helpful auxiliary results conclude the paper.

MODEL AND TWO LEMMAS
Jobs arrive in order, with the convention that job 0 is the last one before time 0. For n ≥ 1, the inter-arrival time between jobs n and n − 1 is denoted by T_n; job 0 arrives T_0 time units before time 0. The service times form a stationary sequence, independent of the inter-arrival times (T_n), which are stationary as well. Denote the arrival and service rates by λ and μ, and assume for stability that the intensity ρ < 1. We are interested in the number of jobs Q in the system (queue + service) at time 0. By changing subscripts, its steady-state distribution can be written (with some abuse of notation) as Q = sup_{t ≥ 0} {A(t) − S(t)}, where A(t) is the number of arrival points within the interval [−t, 0) and S(t) is the other counting process, corresponding to the service times. Besides P(Q ≥ n), we also consider the corresponding Palm distribution conditional on an arrival at time 0, i.e., P^0(Q ≥ n), defined through a right-limit over the number of arrival points in (0, ε]. The Palm probability P^0(·) is to be understood as the probability of an event just before an arrival; denote also by E^0 the corresponding expectation.
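To make the representation Q = sup_{t ≥ 0} {A(t) − S(t)} concrete, here is a small simulation sketch for the M/M/1 special case, where P(Q ≥ n) = ρ^n is known exactly (see § 3.1); the rates, horizon, and replication count are illustrative choices, not from the paper.

```python
import random

def mm1_tail(lam=0.5, mu=1.0, horizon=200.0, reps=5000, seed=1):
    """Monte Carlo estimate of P(Q >= n), n = 1, 2, 3, via the representation
    Q = sup_{t >= 0} {A(t) - S(t)}, with independent Poisson counting
    processes A (arrivals, rate lam) and S (services, rate mu)."""
    rng = random.Random(seed)
    hits = [0, 0, 0]
    for _ in range(reps):
        t_a = rng.expovariate(lam)   # next arrival epoch
        t_s = rng.expovariate(mu)    # next service epoch
        diff = best = 0              # A(t) - S(t) and its running sup
        while min(t_a, t_s) < horizon:
            if t_a <= t_s:
                diff += 1
                best = max(best, diff)   # the sup is attained at arrivals
                t_a += rng.expovariate(lam)
            else:
                diff -= 1
                t_s += rng.expovariate(mu)
        for n in (1, 2, 3):
            hits[n - 1] += best >= n
    return [h / reps for h in hits]

est = mm1_tail()
exact = [0.5 ** n for n in (1, 2, 3)]   # M/M/1 with rho = 0.5: P(Q >= n) = rho^n
```

The finite horizon truncates the supremum; since the random walk A(t) − S(t) has negative drift μ − λ, the truncation bias is negligible for the chosen horizon.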

Two Lemmas
Our technique relies on the construction of (continuous-time) martingales from the counting processes A(t) and S(t). The next two lemmas offer the technical support.
Let (Ω, F, P) be a common probability space and L¹ the class of integrable random variables (i.e., E[|X|] < ∞). Let X = (X_t)_{t≥0} be a (continuous-time) stochastic process in L¹ and F_t := σ(X_s, 0 ≤ s ≤ t) the corresponding natural filtration. By definition, X_t is a (continuous-time) martingale if X_t ∈ L¹ and E[X_t | F_s] = X_s for all s ≤ t. Define the auxiliary (random) functions y_s(t) := E[X_t | F_s] for all t ≥ s ≥ 0.

Lemma 1. The process X_t ∈ L¹ is a martingale if and only if it satisfies, for all s ≥ 0, a continuity condition, a domination condition (enabling dominated convergence), and a differentiability condition (the vanishing of the local drift). This result generalizes a particular result from [9] for the construction of martingales from MAPs.
Proof. The necessity of the three properties is trivial: the first two follow from y_s(t) = X_s for all t ≥ s ≥ 0; the last follows from y′_s(t) = 0 for all t > s.
For the other direction, fix s ≥ 0 and t > s. In the first step we use the Tower Property of conditional expectations; next we apply the domination condition together with the Dominated Convergence Theorem, and lastly the differentiability condition. This shows that y_s(·) is right-differentiable with y′_s(t) = 0 for all t ≥ s. Using the continuity of y_s(·), it then follows that y_s(·) is constant, i.e., E[X_t | F_s] = X_s. □ Note that the continuity condition is needed for the last argument, whereas the domination condition is needed for the interchange of the limit with the expectation.
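As a quick numeric illustration of the vanishing-drift condition, consider the M/M/1 setting of § 3.1 with independent Poisson counting processes A and S: the exponent γ = −ln ρ is precisely the one annihilating the drift of e^{θ(A(t)−S(t))}. The rates below are illustrative.

```python
import math

lam, mu = 0.5, 1.0
rho = lam / mu

def drift(theta):
    """d/dt log E[exp(theta * (A(t) - S(t)))] for independent Poisson
    processes A (rate lam) and S (rate mu)."""
    z = math.exp(theta)
    return lam * (z - 1.0) + mu * (1.0 / z - 1.0)

theta_star = -math.log(rho)              # gamma = -ln(rho), cf. Section 3.1
assert abs(drift(theta_star)) < 1e-12    # zero drift <=> martingale
assert drift(0.5 * theta_star) < 0       # other exponents only give super/sub-martingales
```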
We shall use Lemma 1, and more precisely its differentiability condition, for martingale constructions. (An example of a stochastic process which does not satisfy the differentiability condition is X_t = |B_t|, where B_t is a standard Brownian motion; indeed, y_0(t) = E|B_t| = √(2t/π) fails to be right-differentiable at t = 0.) Because validating the differentiability condition directly is exceptionally difficult/tedious, we will prove the martingale property using the next result.

Lemma 2. Let Z_t be a Markov process on a state space E with generator G. Assume that there exists a sequence of stopping times τ_n ↑ ∞, n ∈ N, such that {Z_{t∧τ_n}, t ≥ 0} is a stopped Markov process with state space E_n ⊆ E, for each n ∈ N. Let f : E → R be a measurable function bounded on E_n, for each n ∈ N. If (Gf)(Z_t) = 0 for all t < τ_n, then {f(Z_{t∧τ_n}), t ≥ 0} is a martingale, and hence {f(Z_t), t ≥ 0} is a local martingale.

This is an application of Dynkin's Formula (see Appendix § A). This result, along with a careful technical argument (see the proofs of Corollaries 3 and 5), is particularly needed because Dynkin's Formula cannot be directly applied to show that the constructed processes are martingales: they are not bounded.
Proof. Fix n ∈ N and let f|_{E_n} : E_n → R be the (bounded, measurable) restriction of f. Applying Dynkin's Formula to {Z_{t∧τ_n}, t ≥ 0} yields the corresponding martingale, whence the claim follows by using (Gf)(Z_t) = 0 for t < τ_n. The rest is clear. □

GI^X/M/1
We can now proceed with the general analysis of the GI^X/M/1 queue and then consider several special cases. The arrivals are driven by a renewal process; denote by f(·) and F(·) the density and distribution, respectively, of the inter-arrival times. The arrivals occur in bulks/batches whose sizes follow the distribution of a positive, discrete random variable X; the arrival rate is thus λE[X], and the service rate is denoted by μ. We assume that the batch size is light-tailed, i.e., its moment generating function is finite in a neighborhood of the origin; we additionally impose a mild condition preventing X from having a 'thin' tail. Such a thin-tailed case could easily be treated within our framework by constructing super-martingales rather than martingales, as done next. A certainly interesting case, but not covered by our framework, is when X has a heavy-tailed distribution.
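The light-tail assumption can be made concrete in the Geometric case treated later: the MGF of X ~ Geometric(q) is finite exactly for e^θ < 1/(1 − q). A small sketch (the values of q and θ are illustrative):

```python
import math

def geo_mgf(theta, q):
    """MGF of X ~ Geometric(q) on {1, 2, ...}: E[e^{theta X}],
    finite iff e^{theta} < 1/(1-q)."""
    z = math.exp(theta)
    assert z * (1 - q) < 1, "outside the region of finiteness"
    return q * z / (1 - (1 - q) * z)

q, theta = 0.4, 0.2
# cross-check the closed form against a (well-truncated) direct summation
direct = sum(q * (1 - q) ** (k - 1) * math.exp(theta * k) for k in range(1, 2000))
```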
A key element in the structure of M_t is the random function h(·), which takes as argument the 'memory' R(t) at time −t, i.e., the remaining lifetime of the arrival process. This is needed to compute the probability of an arrival within the interval [−(t + ε), −t) in terms of the corresponding hazard rate, i.e., η(R(t))ε + o(ε) for small ε > 0, where lim_{ε→0} o(ε)/ε = 0. Note that the hazard rate replaces the constant rate λ, had the arrival process been a Poisson process.
Proof. Fix t ≥ 0. The expressions for θ and h(·) follow by invoking the differentiability condition of Lemma 1. Expanding the expectation, we note that on an arrival R(t + ε) = 0, i.e., the residual lifetime resets itself, whence the term h(0) in the first line; we also used the conditional independence of the arrival and service counting processes. Denoting for convenience R(t) = r and rearranging terms, we obtain an initial value problem; solving it with h(0) = 1 (note that h(·) can be arbitrarily scaled) yields the expression of h(·) and the condition on θ from the Corollary.
To show the existence of θ we need to show that the defining equation admits a solution; this follows from its value at 0, the assumption from (2), and the stability condition. Next, to show that h(·) is bounded, we integrate by parts. Finally, to prove that M_t is a martingale, consider the Markov process Z_t := (A(t) − S(t), R(t)), t ≥ 0, define the stopping times τ_n, n ∈ N, with the convention inf ∅ = ∞, and let f(z, r) := h(r)e^{θz}. Consider the stopped Markov process Z_{t∧τ_n}: the condition (Gf)(z, r) = 0 can be verified on its state space, where T_1(r) denotes the time of the first arrival given the initial memory r. Then the claimed bounds hold for all n ≥ 1, with θ and h(·) from Corollary 3.
The contributions of Theorem 4 are therefore closed-form (and transform-free) bounds on Q's distribution for GI^X/M/1; see also below for some cases with exact results. We note that while the parameter θ is implicit, it can be obtained using a simple (logarithmic-time) binary search. For instance, in the M_r/M/1 case θ satisfies a fixed-point equation; denoting β = e^{−θ}, this can be rewritten as an equation in β with a unique (real) solution when ρ < 1. Some numerical results will be shown in § 4.2.
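A sketch of the binary search for θ, assuming Poisson batch arrivals with Geometric batch sizes, in which case the zero-drift condition takes the explicit form λ(E[e^{θX}] − 1) + μ(e^{−θ} − 1) = 0; the defining equation in the general GI^X/M/1 case is the one from Corollary 3, and all parameters below are illustrative.

```python
import math

def find_theta(lam, mu, q, tol=1e-12):
    """Bisection for the positive root of
       F(theta) = lam*(M_X(theta) - 1) + mu*(exp(-theta) - 1) = 0,
    with X ~ Geometric(q); by convexity of F and F'(0) = lam*E[X] - mu < 0
    under stability, the root is unique and lies in (0, ln(1/(1-q)))."""
    def F(theta):
        z = math.exp(theta)
        mgf = q * z / (1 - (1 - q) * z)
        return lam * (mgf - 1.0) + mu * (1.0 / z - 1.0)
    lo, hi = 1e-9, math.log(1.0 / (1.0 - q)) - 1e-9  # F(lo) < 0, F(hi) > 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta = find_theta(lam=0.3, mu=1.0, q=0.5)   # rho = lam*E[X]/mu = 0.6 < 1
```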
3.1 Special Cases: M/M/1, GI/M/1, GI^Geo/M/1

When the arrival process is Poisson there is no need for tracking the 'memory' R(t), due to the memoryless property, and hence h ≡ 1. The martingale process becomes M_t = e^{γ(A(t)−S(t))}, where γ := −ln ρ. Remarkably, the bound on P(Q ≥ n) becomes exact, i.e., P(Q ≥ n) = ρ^n; the reason is that A(τ) − S(τ) = n at the stopping time when the batch size is 1, as a by-product of using counting processes in the representation of Q. The exact result is also captured using Kingman's (discrete-time) martingale.

The GI/M/1 analysis follows by simply letting X = 1. For the same reason as in the M/M/1 case, i.e., A(τ) − S(τ) = n, the exact result (shown in (7) below) is recovered. We point out that both Kingman's martingale and an alternative recent (continuous-time) martingale for GI/M/1 from [9] only yield upper bounds, due to the existence of an overshoot at the stopping time τ; the latter martingale is built in terms of a compound process drained at rate 1, corresponding to the '−t' term in the exponent. An exact result was, however, obtained earlier by Ross [27] using an additional stopping time, a neat conditioning argument, and the memoryless property of the service times.

Lastly, consider the GI^X/M/1 queue when X has a Geometric distribution with parameter q, i.e., P(X = k) = q(1 − q)^{k−1} for k ≥ 1, whose tail is invariant to shifts (the memoryless property of X). Applying Theorem 4 we obtain an exact result for n ≥ 1. This is the same result as for GI/M/1, except for the value of θ, driven by (3) from Corollary 3. We also note that our solution drastically simplifies the standard M^X/M/1 solution (see, e.g., [15], p. 259), especially concerning the existence of the unique solution, which otherwise relies on a much more involved argument based on Rouché's Theorem from complex analysis. While only the existence of θ was proven in Corollary 3, we note that in both the GI/M/1 and GI^Geo/M/1 cases uniqueness is an immediate by-product of the exact result, since lim_{n→∞} P(Q ≥ n)^{1/n} = e^{−θ}.
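The GI/M/1 exact result can be cross-checked against the classical fixed point σ = F*(μ(1 − σ)), where F* is the Laplace transform of the inter-arrival time and σ = e^{−θ}; a sketch (the transforms and rates below are illustrative):

```python
def gi_m1_sigma(lst, mu, tol=1e-12):
    """Fixed-point iteration for sigma = F*(mu*(1 - sigma)) on (0, 1),
    where lst is the Laplace transform of the inter-arrival time;
    then P(Q >= n) is geometric with ratio sigma = e^{-theta}."""
    s = 0.5
    for _ in range(10_000):
        s_new = lst(mu * (1.0 - s))
        if abs(s_new - s) < tol:
            break
        s = s_new
    return s

mu, lam = 1.0, 0.8
poisson = gi_m1_sigma(lambda s: lam / (lam + s), mu)             # M/M/1: sigma = rho
erlang2 = gi_m1_sigma(lambda s: (2*lam / (2*lam + s)) ** 2, mu)  # E2/M/1: smoother arrivals
```

As expected, Poisson inter-arrivals recover σ = ρ, while the less variable Erlang-2 inter-arrivals yield a strictly smaller decay ratio.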

SM/M/1
A Semi-Markovian Process (SMP), also called a Markov Renewal Process or a (discrete) Markov Additive Process [4], is defined by its kernel. At one extreme, if there is a single state (i.e., |S| = 1), then the associated point process (i.e., the collection of points generated by the inter-arrival intervals) is a renewal process, and hence the SM/M/1 queue instantiates to the GI/M/1 queue. At the other extreme, if the sojourn times in all of the states are exponentially distributed (i.e., F_i(t) = 1 − e^{−λ_i t} for all i), then the SMP instantiates to a Markov process.
In an SMP, the sojourn times in the states not only follow general distributions but are not necessarily independent; they are, however, conditionally independent given the states. Due to the representation of Q from (1), we assume without loss of generality that the modulating Markov chain is reversible. Otherwise, one would need to construct and work with the corresponding reversed process, and more exactly to replace the kernel and transition matrix by their time-reversed counterparts, where Δ is a diagonal matrix with the stationary probabilities π_i on its main diagonal. We also assume that at time 0 the chain is in a state s_0 such that the associated SMP gives the state at time −t; its stationary distribution is given in ([17], p. 411). The distribution of the sojourn time in state i is denoted by F_i; if this distribution is (absolutely) continuous², the corresponding density is denoted by f_i(·) and the hazard rate by η_i(·). Denoting by f_{i,j}(·) the corresponding densities of the kernel, we also consider the hazard rates η_{i,j}(·).
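The reversed-process construction can be sketched as follows; the two-state modulating chain below is illustrative, and, as the sketch also confirms, any two-state chain is automatically reversible, consistent with the WLOG assumption above.

```python
def stationary(P, iters=10_000):
    """Stationary distribution of a finite Markov chain via power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def reversed_chain(P, pi):
    """Time-reversed transition matrix: Pr[i][j] = pi[j] * P[j][i] / pi[i]."""
    n = len(P)
    return [[pi[j] * P[j][i] / pi[i] for j in range(n)] for i in range(n)]

P = [[0.2, 0.8], [0.4, 0.6]]   # illustrative two-state modulating chain
pi = stationary(P)              # detailed balance: pi[0]*P[0][1] == pi[1]*P[1][0]
Pr = reversed_chain(P, pi)      # equals P here: two-state chains are reversible
```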
Proof. Fix n ≥ 0. The Palm-distribution part proceeds exactly as in the proof of Theorem 4. For the other part we need to compute the corresponding expectation; the rest follows from (8) and the construction of e^{−θ} and h_i(0) from Corollary 5. □

ΣSM/M/1
The input is a superposition of two SMP processes (the general case of multiple processes follows similarly). We use the same notation as earlier, except for subscripting the parameters accordingly (e.g., λ_i for the arrival rates of the SMPs); the overall service rate is μ. The fundamental difficulty of such a system is that the superposed input is not stationary, even when the individual inputs are renewal processes (unless Poisson). To see why our continuous-time martingale is particularly suitable to capture this system, consider first the alternative (discrete-time) SM/M/1 martingale embedded at the arrival points T_n := inf{t ≥ 0 : A(t) = n}, with h_i ≡ h_i(0) for brevity. The proof follows from ([4], Proposition XI.2.4) by noting that the state sequence observed at the arrival points forms a MAP; it can be shown that this martingale yields the same results as Theorem 6. The key reason why this martingale is not suitable for the superposed process is that the embedding points corresponding to the individual arrival processes differ (i.e., T^1_n and T^2_n). In continuous time, however, multiplexing martingales proceeds by first building two martingales for two fictitious SM/M/1 queues with service rates μ_1 and μ_2, respectively, such that μ_1 + μ_2 = μ; also, μ_i > λ_i for the stability of the two queues. Applying Corollary 5, the two martingales are M_{i,t} := h_i(R_i(t)) e^{θ_i(A_i(t) − S_i(t))}, i = 1, 2, t ≥ 0 (with h_i evaluated in the current state of the i-th SMP),
where S_i(t) is a Poisson process with rate μ_i. Recall that θ_i depends on μ_i; we write this dependency in terms of the functions θ_1(μ_1) and θ_2(μ_2), which are positive (see the argument about the existence and uniqueness of θ from Corollary 5).
Next we show how to find μ_1 so as to obtain a suitable martingale for the original ΣSM/M/1 queue from M_{1,t} and M_{2,t}. First, the stability conditions yield the margins λ_1 < μ_1 < μ − λ_2 for the parameter μ_1. From the last part of the proof of Corollary 5, involving the existence of θ > 0, we obtain the limiting behavior of θ_1(μ_1) at these margins: when μ_1 = λ_1 the first queue becomes unstable and θ = 0 would be the only solution of the fixed-point equation (9); a similar degeneracy occurs for θ_2 at the other margin. From continuity (of the eigenvalue solution), there exists a value of μ_1 such that θ_1(μ_1) = θ_2(μ − μ_1). Using this value of μ_1, the product of the two (independent) martingales M_{1,t} and M_{2,t} is itself a martingale, in which S(t) = S_1(t) + S_2(t) is a Poisson process with rate μ, by the superposition property of Poisson processes.
The key reason for choosing μ_1 to guarantee the same θ for the individual martingales M_{1,t} and M_{2,t} is that we can then express the martingale M_t in terms of A_1(t) + A_2(t) − S(t); this is exactly the term driving the queueing process in the original ΣSM/M/1 queue (recall (1)). Therefore, stochastic bounds on Q follow immediately as in Theorem 6, and similarly for the lower bound; note the exponent (n − 2), whereby the '−2' stems from the calculations of the Palm expectation (see the proof of Theorem 6). Moreover, the additional 'inf' (to be replaced by 'sup' for the lower bound) is needed because, with two arrival streams, the stopping time τ can occur on an arrival from a single stream only, i.e., either R_1(τ) = 0 or R_2(τ) = 0.
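The construction of μ_1 can be sketched in the simplest conceivable instance, with both components Poisson, where θ_i(μ_i) = ln(μ_i/λ_i); as a consistency check, the common decay rate then matches −ln ρ of the aggregate M/M/1 queue. All parameters below are illustrative.

```python
import math

def split_service_rate(lam1, lam2, mu, tol=1e-12):
    """Find mu1 with theta1(mu1) = theta2(mu - mu1), where for a Poisson
    component theta_i(mu_i) = ln(mu_i / lam_i); theta1 is increasing and
    theta2 decreasing in mu1, so bisection on their difference applies."""
    lo, hi = lam1, mu - lam2          # stability margins for mu1
    while hi - lo > tol:
        mu1 = 0.5 * (lo + hi)
        if math.log(mu1 / lam1) < math.log((mu - mu1) / lam2):
            lo = mu1
        else:
            hi = mu1
    return 0.5 * (lo + hi)

lam1, lam2, mu = 0.3, 0.4, 1.0
mu1 = split_service_rate(lam1, lam2, mu)   # mu1 = mu * lam1 / (lam1 + lam2)
theta = math.log(mu1 / lam1)               # common decay rate = -ln(rho)
```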
Moving to the numerical evaluation, an intuitive SM but non-renewal process is an alternating renewal (AR) process with states S(t) ∈ {1, 2}. The correlation structure can be explained in an extreme scenario with very different sojourn-time means in the two states: if an arbitrary inter-arrival is large then the next is likely small, and vice versa.
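The negative lag-1 autocorrelation of an AR process can be checked empirically; for simplicity, the sketch below uses Exponential sojourns with alternating means rather than the Weibull(2,1)/Exponential pair used in the figures, and the means are illustrative.

```python
import random

def ar_interarrivals(n, mean1=0.5, mean2=5.0, seed=7):
    """Alternating renewal inter-arrivals: independent Exponential draws
    whose mean alternates between mean1 (state 1) and mean2 (state 2)."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / (mean1 if k % 2 == 0 else mean2))
            for k in range(n)]

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a sequence."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    cov = sum((x[k] - m) * (x[k + 1] - m) for k in range(n - 1)) / (n - 1)
    return cov / var

r = lag1_autocorr(ar_interarrivals(200_000))
# a large inter-arrival is likely followed by a small one, so r < 0
```

For this parameterization the theoretical value is cov = m1*m2 − ((m1+m2)/2)², which is strictly negative whenever the two means differ.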
Fig. 1(c,d) illustrate the bounds' accuracy in AR/M/1 scenarios, with a Weibull(2,1) distribution of shape 2 and scale 1 (i.e., P(T > t) = e^{−t²}) alternating with an Exponential distribution with rate μ_2; the service rate μ depends on the intensity ρ. We consider the bounds on the Palm distribution relative to state 2, i.e., concerning the queue size just before triggering the Exponential inter-arrival with rate μ_2 (the bounds are immediate applications of Theorem 6). Fig. 3 illustrates the accuracy of the SM/M/1 bounds from Theorem 6. The underlying Markov chain has two states (see Fig. 2) and the corresponding kernel is detailed in the caption (the distributions are as in the AR/M/1 case). We consider two scenarios depending on which distribution (Weibull or Exponential) is triggered much more frequently than the other: the former in (c), given that p_{1,2} < p_{2,1}, and the latter in (d).
Lastly, Fig. 4 illustrates the accuracy of the ΣSM/M/1 bounds; the caption details the precise numerical settings. We only consider the upper bounds, to highlight their crucial challenge. In scenarios with burstiness, driven by the underlying Markov chains from Fig. 2(a,b), the 'true' tail of the queue (on a log-scale) is clearly not a straight line. This behavior has already been apparent in the other scenarios (including the M_r/M/1 case), but it is more pronounced here. The upper bounds are tight at small n (i.e., n = 1); however, as they have an exact asymptotic rate (the 'θ') and their prefactor is independent of n, they follow a straight line which inevitably deviates from the real behavior. In some scenarios, depending on burstiness, the lower bounds, which also follow straight lines, become tight in the tail (see Fig. 3(b)). The crucial and arguably remaining challenge for the bounds' accuracy is to properly capture the initial 'bend' characteristic of the 'true' behavior of the tail. From a technical point of view, the open problem is to more precisely characterize the Palm expectation at the stopping time τ recurring in all the proofs. In the GI/M/1 case, in which there is a single state for the underlying Markov chain and τ occurs on an arrival (i.e., R(τ) = 0), exact results could be obtained; as a side remark, the assumption of exponential service times is also crucial for the exact results, as there is no need for an additional random function 'h(·)' to capture the remaining lifetime in the service process, due to the memoryless property. In the SM/M/1 case, in which the Markov chain can have more than one state, or in the ΣSM/M/1 case, in which there is more than a single SMP, the current technique only uses 'rough' deterministic bounds; their deterministic nature is what allows breaking the above expectation and capturing the key quantity E[1_{τ<∞}] = P(Q ≥ n).

CONCLUSIONS
We have developed a methodology to construct a wide class of continuous-time martingales, which were used to derive stochastic bounds in several practically motivated queueing systems. The overall methodology is not only an intuitive and much simpler alternative for recovering the classical GI/M/1 exact result, and for obtaining the first exact result in closed form for GI^Geo/M/1, but it also enables a modular treatment of significantly more complex queues by manipulating martingales from simpler queues. Several extensions to other, even more complex queueing systems are possible, e.g., batch versions such as SM^X/M/1 or their superpositions. Moreover, our queueing results retain the expressiveness of the exact GI/M/1 result, are asymptotically optimal, and are also computationally light, essentially involving an eigenvalue problem in the number of states |S| and binary searches. Further improving the bounds' accuracy, including capturing the non-exponential initial behavior (e.g., for M_r/M/1 or ΣSM/M/1), remains an open problem.

APPENDIX

Definition 8. A real-valued adapted process X is of class DL if, for every t > 0, the family of random variables X_τ, where τ ranges over all stopping times satisfying P(τ ≤ t) = 1, is uniformly integrable.
Theorem 9 (e.g., [26], p. 124). A local martingale is a martingale if and only if it is of class DL.
Theorem 10 (Dynkin's Formula; see, e.g., [32]). Next we provide some useful variations of known results concerning the process M_t = h(t)e^{θ(A(t)−S(t))}, which plays the role of a martingale for the SM/M/1 queue; A(t) and S(t) are the counting processes associated with the stationary inter-arrival and service times, respectively. The function h(t), evaluated at the current state of the Semi-Markov process and at the remaining lifetime R(t) of the arrivals, was shown to be bounded from above. Recall first the definition τ := inf{t : A(t) − S(t) = n}. The next result enables the parallel derivation of both upper and lower bounds. It first appeared in [9] in a less general setting; we present a (simpler) proof for completeness.
Proof. If the inter-arrival times form a renewal process, then the stated limit holds as t → ∞ (see, e.g., [28], p. 102), as an immediate consequence of the Strong Law of Large Numbers. The result extends immediately to the (stationary) SMP case using Birkhoff's Ergodic Theorem, and similarly for the service process. Proof. Fix θ > 0 and t ≥ 0, and assume without loss of generality that there is an arrival at time 0. For clarity, we first give the proof in the renewal case. Finally, a pathwise ordering of the inter-arrival times implies that A(t) is dominated by a renewal counting process, and therefore the MGF of A(t) is also bounded.
Consider now the comparison counting processes: the first inequality follows from the choice of λ_1, whose inter-arrival times are stochastically smaller; the second follows from the corresponding ordering of the partial sums. Therefore, the MGF of A(t) is bounded by the MGF of a counting process with a renewal structure, and the rest follows as in the renewal case by properly choosing λ. □ Let the moment generating function (MGF) M_X(θ) := Σ_{k≥1} P(X = k)e^{θk}, for some θ > 0. Let also η(t) := f(t)/(1 − F(t)) be the hazard rate of T_1 and F*(s) := E[e^{−sT_1}] the corresponding Laplace transform, where s := μ(1 − e^{−θ}). Denote also by R(t) the remaining lifetime of the arrival process at time −t; recall that we are interested in the steady-state queue size Q = sup_{t≥0}{A(t) − S(t)} at time 0.

Figure 2: The SMP processes used for the numerical evaluations of the SM/M/1 and ΣSM/M/1 bounds; the state label denotes the distribution of the sojourn time in that particular state; W(2,1) stands for the Weibull distribution with shape 2 and scale 1; Exp(μ_2) stands for the exponential distribution with rate μ_2; according to the transition probabilities, W(2,1) is the dominating distribution (i.e., triggered more frequently) in (a), whereas Exp(μ_2) is the dominating one in (b).

Figure 4: ΣSM/M/1: bounds on the CCDF P(Q ≥ n); the superposed SMPs are those from Fig. 2(a,b); ρ = 0.9.

In other scenarios the lower bounds are consistently loose, as apparent from Fig. 3(a), which is subject to the SMP from Fig. 2(a) dominated by the Weibull distribution; the same holds in the ΣSM/M/1 case, as it accounts for the same SMP from Fig. 2(a).