
MTIE

The MTIE is a measure of the maximum time interval error over a measurement period $\tau$. MTIE is useful for capturing phase transients in a timing signal, since it describes the maximum phase variation of the signal over a time interval. For the same reason, however, it cannot reveal the underlying noise characteristics of a signal, hence the use of TDEV, which is essentially an RMS rather than a peak estimator. It is defined as

\begin{displaymath}
MTIE(\tau)=\max_{-\infty \leq t_{0} \leq \infty} \left( \max_{t_{0} \leq t \leq t_{0}+\tau} [x(t)] - \min_{t_{0} \leq t \leq t_{0}+\tau} [x(t)] \right)
\end{displaymath}

This can be estimated by
\begin{displaymath}
MTIE(n\tau_{0})=\max_{1 \leq k \leq N-n} \left( \max_{k \leq i \leq k+n} x(i) - \min_{k \leq i \leq k+n} x(i) \right), \qquad n = 1,2,\ldots,N-1.
\end{displaymath} (32)
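For concreteness, the estimator in equation (32) translates directly into code. The following is a minimal sketch, assuming $N$ equally spaced time-error samples taken every $\tau_{0}$ seconds; the function name and interface are illustrative rather than taken from the original work, and the double loop is a direct, unoptimized transcription of the estimator.

\begin{verbatim}
import numpy as np

def mtie(x, tau0=1.0):
    """Estimate MTIE(n*tau0) per equation (32) for n = 1, ..., N-1.

    x    : array of N time-error samples x(i), spaced tau0 apart
    tau0 : sampling interval in seconds (used only to label the taus)

    Returns (taus, mtie_values).
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    taus, values = [], []
    for n in range(1, N):                  # window length in samples
        peaks = []
        for k in range(N - n):             # window start index
            window = x[k:k + n + 1]        # samples k .. k+n inclusive
            peaks.append(window.max() - window.min())
        taus.append(n * tau0)
        values.append(max(peaks))          # worst case over all windows
    return np.array(taus), np.array(values)
\end{verbatim}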
Equation (32) was the form used to calculate MTIE in this work. As with TDEV, MTIE can also be expressed [4] in terms of the power spectral density of the time error. Statistically, it can be estimated at the $2\sigma$ level as four times the corresponding root-mean-square Time Interval Error (TIErms), meaning that approximately 95% of MTIE values lie below the following estimate:
 \begin{displaymath}
MTIE(\tau)\approx 4\sqrt{4\int_0^{\infty}S_x(f)\sin^2(\pi f \tau)df}\end{displaymath} (33)
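To illustrate equation (33), the sketch below numerically evaluates the integral for a user-supplied time-error PSD, truncating the infinite upper limit at a finite cutoff beyond which the PSD is assumed negligible. The function name, the flat-PSD example, and all numeric values are assumptions chosen for illustration only.

\begin{verbatim}
import numpy as np

def mtie_spectral(S_x, tau, f_max, n_points=100000):
    """Approximate MTIE(tau) from the time-error PSD via equation (33):
        MTIE(tau) ~ 4 * sqrt(4 * integral_0^f_max S_x(f) sin^2(pi f tau) df)

    S_x   : callable returning the one-sided PSD of the time error
    f_max : finite upper limit standing in for infinity
    """
    f = np.linspace(0.0, f_max, n_points)
    integrand = S_x(f) * np.sin(np.pi * f * tau) ** 2
    tie_rms = np.sqrt(4.0 * np.trapz(integrand, f))  # TIErms from the PSD
    return 4.0 * tie_rms                             # ~95% (2-sigma) bound

# Example: white phase noise with a flat PSD h0 below a cutoff f_h
h0, f_h = 1e-22, 10.0  # illustrative values only
print(mtie_spectral(lambda f: np.where(f <= f_h, h0, 0.0),
                    tau=1.0, f_max=f_h))
\end{verbatim}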

