A Time Delay Estimator Based on the Signal Integral: Theoretical Performance and Testing on ECG Signals
Abstract:
We present a theoretical and experimental performance study of a method for time delay estimation (TDE) based on the signal integral (TDE-SI). The TDE-SI method considers the delay between two transient signals as the difference between the centers of mass of these signals. The method has three special cases: in the first, time is the mass coordinate and the signal s(t) is the mass distribution (estimate D̂_s); in the second, the squared signal s²(t) is the mass distribution (estimate D̂_s²); and the third is a variant of the second. The bias and the standard deviation (σ) of the estimates have been evaluated when the signal is contaminated by Gaussian white noise. D̂_s is unbiased, but the σ of its TDE is higher than that obtained when working with D̂_s². Moreover, the D̂_s² estimate is biased. A bias-corrected estimate (D̂'_s²) is presented; this D̂'_s² yields a σ of its TDE lower than that of the estimate D̂_s. Hence, D̂'_s² is the most suitable of the three TDE-SI options for TDE. The theoretical estimations are validated by simulation results with artificially generated signals and by real so-called QRS complex waves (ventricular activity) from an electrocardiographic (ECG) signal. © 1994 IEEE
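To make the idea concrete, the following is a minimal sketch of a center-of-mass delay estimator as described in the abstract: the delay is the difference between the centers of mass of two transients, using either the signal s(t) (D̂_s) or the squared signal s²(t) (D̂_s²) as the mass distribution. The function names, sampling rate, and test pulses are illustrative assumptions, not the authors' implementation, and the bias-corrected variant D̂'_s² from the paper is not reproduced here.

```python
import numpy as np

def center_of_mass(x, t, use_square=False):
    """Center of mass of a transient signal.

    use_square=False: the signal itself is the mass distribution (D^_s).
    use_square=True:  the squared signal is the mass distribution (D^_s2).
    """
    m = x ** 2 if use_square else x
    return np.sum(t * m) / np.sum(m)

def tde_si(x1, x2, fs, use_square=False):
    """Delay between two transients as the difference of their centers of mass."""
    t = np.arange(len(x1)) / fs
    return center_of_mass(x2, t, use_square) - center_of_mass(x1, t, use_square)

# Illustrative usage (assumed parameters): two noisy Gaussian pulses delayed by 5 ms
fs = 1000.0                                   # sampling rate in Hz
t = np.arange(0, 0.5, 1 / fs)
pulse = lambda d: np.exp(-((t - 0.25 - d) ** 2) / (2 * 0.01 ** 2))
rng = np.random.default_rng(0)
x1 = pulse(0.0)   + 0.01 * rng.standard_normal(t.size)
x2 = pulse(0.005) + 0.01 * rng.standard_normal(t.size)
print(tde_si(x1, x2, fs, use_square=True))    # approximately 0.005 s
```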
Year of publication:
1994
Keywords:
Source:

Document type:
Article
Status:
Restricted access
Knowledge areas:
- Algorithms
- Signal processing
Subject areas:
- Computer science