Simulating M/M/1 queue in MATLAB
Posted by Erwin on February 23, 2013
An M/M/1 queue can be modeled in MATLAB using discrete-event simulation.
The arrival rate is λ and the mean service time is 1/μ.
The interarrival times and the service times are exponentially distributed. Vectors of exponentially distributed interarrival and service times are generated.
To generate an exponential random vector, a transformation of uniform random variables can be used (inverse-transform sampling). To generate an exponentially distributed random variable with rate λ, start from the CDF F(x) = 1 − e^(−λx) and solve U = F(X) for X, which gives X = −(1/λ) ln(1 − U).
This is equivalent to X = −(1/λ) ln(U), since 1 − U is also uniformly distributed between 0 and 1.
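As a quick sanity check, the inverse-transform idea can be sketched in Python (NumPy stands in for MATLAB here, and the rate value is an arbitrary choice for the example, not a value from the post):

```python
import numpy as np

# Inverse-transform sampling: if U ~ Uniform(0,1),
# then X = -(1/lam) * ln(U) is Exponential with rate lam.
rng = np.random.default_rng(5)
lam = 2.0                       # arbitrary rate for this sketch
U = rng.random(100_000)
X = -np.log(U) / lam

# The sample mean should be close to the theoretical mean 1/lam.
print(X.mean())
```

With 100,000 samples the sample mean lands within about 1% of 1/λ, which is what the inverse-transform argument predicts.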
%generate two vectors of uniformly distributed random numbers
seed1 = RandStream.create('mcg16807','Seed',5);
U1 = rand(seed1, 1, NSamples);
seed2 = RandStream.create('mcg16807','Seed',6);
U2 = rand(seed2, 1, NSamples);
% service time vector
S = -1/mu*log(U1) * 1E3;
% inter-arrival time vector
tau = -1/lambda*log(U2) * 1E3;
Arrival and Departure Times
The arrival times form a Poisson process: the arrival time of the i-th customer is the cumulative sum of the first i exponentially distributed interarrival times.
The departure time of the first customer is D(1) = T(1) + S(1).
The departure times of subsequent customers are D(i) = max(T(i), D(i−1)) + S(i).
The delay of customer i (its time in the system) is W(i) + S(i), where W(i) = max(D(i−1) − T(i), 0) is the waiting time; for a stable M/M/1 queue the expected delay is 1/(μ − λ).
% the arrival time vector
T = cumsum(tau);
% the departure time vector
D = zeros(NSamples, 1);
D(1) = T(1) + S(1);
% the waiting time vector
W = zeros(NSamples, 1);
W(1) = 0;
for i = 2:NSamples
    D(i) = max(T(i), D(i-1)) + S(i);
    W(i) = max(D(i-1) - T(i), 0);
end
% the delay time vector: in-queue wait time plus service time
Delay_Time = zeros(NSamples, 1);
for i = 1:NSamples
    Delay_Time(i) = W(i) + S(i);
end
W is a vector that contains the waiting times.
Delay_Time is a vector that contains the time spent by each packet in the system.
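The whole procedure can be checked against theory by replaying it in Python: for a stable M/M/1 queue (λ < μ) the mean time in system should approach 1/(μ − λ). NumPy and the parameter values below are assumptions for this sketch, not values from the original post:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
lam, mu = 1.0, 2.0                 # arrival and service rates (assumed values)

tau = rng.exponential(1 / lam, n)  # interarrival times
S = rng.exponential(1 / mu, n)     # service times
T = np.cumsum(tau)                 # arrival times

# Same recursion as the MATLAB loop: a customer starts service either
# on arrival or when the previous customer departs, whichever is later.
D = np.empty(n)
W = np.empty(n)
D[0] = T[0] + S[0]
W[0] = 0.0
for i in range(1, n):
    D[i] = max(T[i], D[i - 1]) + S[i]
    W[i] = max(D[i - 1] - T[i], 0.0)

delay = W + S                      # time in system per customer
print(delay.mean(), 1 / (mu - lam))
```

With these rates the empirical mean delay comes out close to the theoretical value 1/(μ − λ) = 1, which is a useful end-to-end check on the simulation logic.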