The performance of the cross-correlation technique for time-delay estimation in the presence of sampling jitter is discussed. It is shown that the effect of sampling jitter in the signal to be correlated with a reference template is to convolve the jitter probability density function with the cross-correlation function that would be obtained in the absence of jitter. A simulation of the approach, showing how the error caused by the jitter can be removed by a simple deconvolution process, is presented. A practical application of the approach to a rotation-measurement algorithm is also given.
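The convolution result and the deconvolution step can be sketched numerically. The following is a minimal illustration, not the paper's actual simulation: it assumes a synthetic Gaussian pulse as the template, models the jitter as a Gaussian-distributed timing offset applied per acquisition (a simplification of per-sample jitter), and uses a regularised inverse filter as one possible "simple deconvolution"; all names and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (hypothetical values): a Gaussian pulse as the reference
# template and a delayed copy of it as the received signal.
n = 2048
t = np.arange(n)
template = np.exp(-0.5 * ((t - n / 2) / 20.0) ** 2)
true_delay = 37        # samples
sigma_j = 4.0          # jitter standard deviation, in samples

def xcorr(a, b):
    """Circular cross-correlation via the FFT; the peak index estimates the delay."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))))

signal = np.roll(template, true_delay)
r_clean = xcorr(signal, template)          # cross-correlation without jitter

# Average the cross-correlation over many jittered acquisitions, each shifted
# by a random offset drawn from the jitter density.
trials = 2000
r_jit = np.zeros(n)
for _ in range(trials):
    offset = int(round(rng.normal(0.0, sigma_j)))
    r_jit += xcorr(np.roll(signal, offset), template)
r_jit /= trials

# Stated result: the jittered cross-correlation equals the jitter density
# convolved with the jitter-free cross-correlation.
k = np.arange(n) - n // 2
pdf = np.exp(-0.5 * (k / sigma_j) ** 2)
pdf /= pdf.sum()
H = np.fft.fft(np.fft.ifftshift(pdf))      # transfer function of the jitter blur
r_pred = np.real(np.fft.ifft(np.fft.fft(r_clean) * H))

# One possible "simple deconvolution": a regularised (Wiener-style) inverse
# filter that undoes the blurring without amplifying noise where H is small.
lam = 1e-3
r_rec = np.real(np.fft.ifft(np.fft.fft(r_jit) * np.conj(H) / (np.abs(H) ** 2 + lam)))

peak = r_clean.max()
print(np.max(np.abs(r_jit - r_pred)) / peak)   # convolution-model mismatch
print(np.max(np.abs(r_rec - r_clean)) / peak)  # residual after deconvolution
```

The regularisation term `lam` is the key design choice: a plain spectral division by `H` would blow up at frequencies where the jitter transfer function is nearly zero, whereas the regularised filter trades a small residual blur for numerical stability.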