101
Al-Fahoum AS. Quality Assessment of ECG Compression Techniques Using a Wavelet-Based Diagnostic Measure. IEEE Trans Inf Technol Biomed 2006; 10:182-91. [PMID: 16445263] [DOI: 10.1109/titb.2005.855554]
Abstract
Electrocardiogram (ECG) compression techniques are gaining momentum due to the huge database requirements and the wide-band communication channels needed to maintain high-quality ECG transmission. Advances in computer software and hardware enable the birth of new techniques in ECG compression, aiming at high compression rates. In general, most of the introduced ECG compression techniques base their performance evaluation on either inaccurate measures or measures that target only the random behavior of the error. In this paper, a new wavelet-based quality measure is proposed. The new approach is based on decomposing the segment of interest into frequency bands, where a weighted score is given to each band depending on its dynamic range and its diagnostic significance. A performance evaluation of the measure is conducted quantitatively and qualitatively. Comparative results with existing quality measures show that the new measure is insensitive to error variation, is accurate, and correlates very well with subjective tests.
Affiliation(s)
- Amjed S Al-Fahoum
- Electronic Engineering Department, Yarmouk University, Irbid, Jordan.
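To make the banding idea concrete, here is a minimal Python sketch of a band-weighted wavelet distortion score in the spirit of the abstract above; the wavelet, decomposition depth, and subband weights are illustrative assumptions, not the values used by the author.

```python
# Illustrative band-weighted wavelet distortion score (assumed wavelet, depth,
# and weights; not the author's exact measure).
import numpy as np
import pywt

def banded_wavelet_distortion(original, reconstructed, wavelet="bior4.4",
                              level=5, weights=None):
    """Weighted sum of per-subband PRD-like errors between two ECG segments."""
    c_orig = pywt.wavedec(original, wavelet, level=level)
    c_rec = pywt.wavedec(reconstructed, wavelet, level=level)
    if weights is None:
        # Hypothetical weights: emphasise the approximation and coarse detail
        # bands, which carry most of the diagnostic morphology (P, QRS, T).
        weights = np.linspace(1.0, 0.2, num=len(c_orig))
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    score = 0.0
    for w, co, cr in zip(weights, c_orig, c_rec):
        band_err = np.sqrt(np.sum((co - cr) ** 2) / (np.sum(co ** 2) + 1e-12))
        score += w * 100.0 * band_err
    return score

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(2048)) * 0.01   # toy stand-in for an ECG segment
print(banded_wavelet_distortion(x, x + 0.005 * rng.standard_normal(2048)))
```

Weighting the coarse bands more heavily reflects the idea that errors in diagnostically significant low-frequency morphology should cost more than errors in fine-scale detail.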
102
103
Abstract
An electrocardiogram (ECG) data compression scheme based on gain-shape vector quantization is presented. The proposed approach exploits the fact that ECG signals generally show redundancy among adjacent heartbeats and among adjacent samples. An ECG signal is QRS-detected and segmented according to the detected fiducial points. The segmented heartbeats are vector quantized, and the residual signals are calculated and encoded using the AREA algorithm. The experimental results show that with the proposed method both the visual quality and the objective quality are excellent even at low bit rates. An average PRD of 5.97% at 127 b/s is obtained for all 48 records in the MIT-BIH database. The proposed method also outperforms others on the same test data set.
Affiliation(s)
- Chia-Chun Sun
- National Cheng Kung University, Tainan, Taiwan, ROC.
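A rough sketch of gain-shape vector quantization of one fixed-length heartbeat follows; the unit-norm shape codebook here is random for illustration (a real coder would train it, for example with LBG, on segmented beats), and the AREA residual coding stage is not reproduced.

```python
# Gain-shape VQ of one fixed-length heartbeat (random illustrative codebook;
# a real codebook would be trained on segmented beats).
import numpy as np

def gain_shape_quantize(beat, shape_codebook, gain_levels):
    """Return (gain index, shape index) for a fixed-length heartbeat."""
    gain = np.linalg.norm(beat)
    shape = beat / (gain + 1e-12)                        # unit-energy shape
    shape_idx = int(np.argmax(shape_codebook @ shape))   # best-correlated codevector
    gain_idx = int(np.argmin(np.abs(gain_levels - gain)))
    return gain_idx, shape_idx

def gain_shape_reconstruct(gain_idx, shape_idx, shape_codebook, gain_levels):
    return gain_levels[gain_idx] * shape_codebook[shape_idx]

rng = np.random.default_rng(1)
codebook = rng.standard_normal((64, 200))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)  # unit-norm shapes
gains = np.linspace(0.1, 5.0, 32)
beat = rng.standard_normal(200)
idx = gain_shape_quantize(beat, codebook, gains)
approx = gain_shape_reconstruct(*idx, codebook, gains)
```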
104
Blanco-Velasco M, Cruz-Roldán F, Godino-Llorente JI, Blanco-Velasco J, Armiens-Aparicio C, López-Ferreras F. On the use of PRD and CR parameters for ECG compression. Med Eng Phys 2005; 27:798-802. [PMID: 15869896] [DOI: 10.1016/j.medengphy.2005.02.007]
Abstract
The quality of the reconstructed signal in an electrocardiogram (ECG) compression scheme must be measured by objective means, the percentage root-mean-square difference (PRD) being the most widely used. However, this parameter depends on the dc level of the signal, so confusion can arise in the evaluation of ECG compressors. In this communication, it is shown that if the performance of an ECG coder is evaluated only in terms of quality, considering exclusively the PRD parameter, incorrect conclusions can be inferred. The objective of this work is to propose the joint use of several parameters; as the simulations show, the effectiveness and performance of the ECG coder are then evaluated with more precision, and the conclusions drawn from the results are more reliable.
Affiliation(s)
- Manuel Blanco-Velasco
- Dep. Teoría de la Señal y Comunicaciones, Escuela Politécnica, Universidad de Alcalá, Alcalá de Henares (Madrid), Spain.
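The dc-level dependence of the PRD is easy to reproduce. The sketch below, assuming a zero-mean toy signal and a constant offset such as the 1024 ADC baseline of MIT-BIH records, contrasts the plain PRD with a mean-removed (normalized) variant.

```python
# PRD versus normalized PRD: the same error looks much smaller when the signal
# rides on a dc offset (e.g. an ADC baseline such as the 1024 of MIT-BIH records).
import numpy as np

def prd(x, x_rec):
    """Percentage RMS difference computed on the raw samples (dc included)."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def prdn(x, x_rec):
    """Normalized PRD: the signal mean is removed from the denominator."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum((x - np.mean(x)) ** 2))

rng = np.random.default_rng(2)
ecg = rng.standard_normal(2000)                  # zero-mean toy signal
err = 0.05 * rng.standard_normal(2000)           # fixed reconstruction error
print(prd(ecg, ecg + err))                       # honest figure
print(prd(ecg + 1024.0, ecg + 1024.0 + err))     # deceptively tiny
print(prdn(ecg + 1024.0, ecg + 1024.0 + err))    # offset-independent
```

The same absolute error yields a far smaller PRD once the offset inflates the denominator, which is the kind of misleading comparison the authors caution against.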
105
Abstract
The wavelet transform has emerged over recent years as a powerful time-frequency analysis and signal coding tool favoured for the interrogation of complex nonstationary signals. Its application to biosignal processing has been at the forefront of these developments where it has been found particularly useful in the study of these, often problematic, signals: none more so than the ECG. In this review, the emerging role of the wavelet transform in the interrogation of the ECG is discussed in detail, where both the continuous and the discrete transform are considered in turn.
Affiliation(s)
- Paul S Addison
- CardioDigital Ltd, Elvingston Science Centre, East Lothian, UK.
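For readers who want to experiment, a minimal PyWavelets sketch of the two transforms discussed in the review, applied to a toy surrogate signal; the wavelet names, scale range, and sampling rate are arbitrary choices, not recommendations from the paper.

```python
# Continuous and discrete wavelet transforms of a toy surrogate signal
# (wavelets, scales, and sampling rate are arbitrary illustrative choices).
import numpy as np
import pywt

fs = 360.0                                  # sampling rate in Hz
t = np.arange(0, 4, 1 / fs)
sig = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 17 * t)

# CWT: a dense time-scale map, convenient for visual interrogation.
scales = np.arange(1, 64)
cwt_coeffs, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1 / fs)

# DWT: a compact multiresolution decomposition, the usual basis for ECG coding.
dwt_coeffs = pywt.wavedec(sig, "db4", level=5)
rec = pywt.waverec(dwt_coeffs, "db4")
print(cwt_coeffs.shape, [c.size for c in dwt_coeffs], np.allclose(rec[:sig.size], sig))
```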
106
Tai SC, Sun CC, Yan WC. A 2-D ECG Compression Method Based on Wavelet Transform and Modified SPIHT. IEEE Trans Biomed Eng 2005; 52:999-1008. [PMID: 15977730] [DOI: 10.1109/tbme.2005.846727]
Abstract
A two-dimensional (2-D) wavelet-based electrocardiogram (ECG) data compression method is presented which employs a modified set partitioning in hierarchical trees (SPIHT) algorithm. The modified SPIHT algorithm further exploits the redundancy among the medium- and high-frequency subbands of the wavelet coefficients, and the proposed 2-D approach exploits the fact that ECG signals generally show redundancy between adjacent beats and between adjacent samples. An ECG signal is cut and aligned to form a 2-D data array, to which the 2-D wavelet transform and the modified SPIHT are then applied. Records selected from the MIT-BIH arrhythmia database are tested. The experimental results show that the proposed method achieves a high compression ratio with relatively low distortion and is effective for various kinds of ECG morphologies.
Affiliation(s)
- Shen-Chuan Tai
- Department of Electrical Engineering, National Cheng Kung University, Tainan 701, Taiwan, R.O.C.
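A small sketch of the cut-and-align step that turns a single-lead ECG into a 2-D array, followed by a 2-D wavelet decomposition; the R-peak positions, window length, and wavelet are hypothetical, and the modified SPIHT coder itself is not reproduced.

```python
# Stack fixed-length, R-peak-centred beats into a 2-D array and apply a 2-D
# wavelet transform (assumed R-peak locations and window length).
import numpy as np
import pywt

def beats_to_image(ecg, r_peaks, beat_len=256):
    """Stack peak-aligned beats row by row; rows are strongly correlated."""
    half = beat_len // 2
    rows = [ecg[r - half:r + half] for r in r_peaks
            if r - half >= 0 and r + half <= len(ecg)]
    return np.vstack(rows)

rng = np.random.default_rng(3)
ecg = rng.standard_normal(360 * 60)                  # one minute of toy samples
peaks = np.arange(200, len(ecg) - 200, 300)          # assumed R-peak locations
img = beats_to_image(ecg, peaks)
coeffs2 = pywt.wavedec2(img, "bior4.4", level=3)     # 2-D DWT of the beat image
print(img.shape, len(coeffs2))
```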
107
Miaou SG, Chao SN. Wavelet-based lossy-to-lossless ECG compression in a unified vector quantization framework. IEEE Trans Biomed Eng 2005; 52:539-43. [PMID: 15759584] [DOI: 10.1109/tbme.2004.842791]
Abstract
In a prior work, a wavelet-based vector quantization (VQ) approach was proposed to perform lossy compression of electrocardiogram (ECG) signals. In this paper, we investigate and fix its coding inefficiency problem in lossless compression and extend it to allow both lossy and lossless compression in a unified coding framework. The well-known 9/7 filters and 5/3 integer filters are used to implement the wavelet transform (WT) for lossy and lossless compression, respectively. The codebook updating mechanism, originally designed for lossy compression, is modified to allow lossless compression as well. In addition, a new and cost-effective coding strategy is proposed to enhance the coding efficiency of set partitioning in hierarchical tree (SPIHT) at the less significant bit representation of a WT coefficient. ECG records from the MIT/BIH Arrhythmia and European ST-T Databases are selected as test data. In terms of the coding efficiency for lossless compression, experimental results show that the proposed codec improves the direct SPIHT approach and the prior work by about 33% and 26%, respectively.
Affiliation(s)
- Shaou-Gang Miaou
- Multimedia Computing and Telecommunications Laboratory, Department of Electronic Engineering, Chung Yuan Christian University, Chung-Li, 32023 Taiwan, ROC.
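As a companion to the lossless path, here is a self-contained sketch of one level of the reversible LeGall 5/3 integer lifting transform with symmetric boundary extension; the 9/7 lossy filters, the codebook updating, and the SPIHT refinements of the paper are not reproduced, and the boundary handling shown is one common convention.

```python
# One level of the reversible 5/3 integer lifting transform (even-length
# integer input, symmetric boundary extension); lossless path only.
import numpy as np

def lift_53_forward(x):
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2], x[1::2]
    even_next = np.append(even[1:], even[-1])    # x[2i+2], mirrored at the end
    d = odd - (even + even_next) // 2            # high-pass (detail) samples
    d_prev = np.insert(d[:-1], 0, d[0])          # d[i-1], mirrored at the start
    s = even + (d_prev + d + 2) // 4             # low-pass (approximation) samples
    return s, d

def lift_53_inverse(s, d):
    """Exact inverse of lift_53_forward, hence bit-exact (lossless)."""
    d_prev = np.insert(d[:-1], 0, d[0])
    even = s - (d_prev + d + 2) // 4
    even_next = np.append(even[1:], even[-1])
    odd = d + (even + even_next) // 2
    x = np.empty(2 * s.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

x = np.random.default_rng(4).integers(-1024, 1024, size=512)
s, d = lift_53_forward(x)
assert np.array_equal(lift_53_inverse(s, d), x)   # perfect reconstruction
```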
108
Blanco-Velasco M, Cruz-Roldán F, López-Ferreras F, Bravo-Santos A, Martínez-Muñoz D. A low computational complexity algorithm for ECG signal compression. Med Eng Phys 2004; 26:553-568. [PMID: 15271283] [DOI: 10.1016/j.medengphy.2004.04.004]
Abstract
In this work, a filter-bank-based algorithm for electrocardiogram (ECG) signal compression is proposed. The new coder consists of three different stages. In the first one--the subband decomposition stage--we compare the performance of a nearly perfect reconstruction (N-PR) cosine-modulated filter bank with the wavelet packet (WP) technique. Both schemes use the same coding algorithm, thus permitting an effective comparison. The target of the comparison is the quality of the reconstructed signal, which must remain within predetermined accuracy limits. We employ the most widely used quality criterion for the compressed ECG: the percentage root-mean-square difference (PRD). It is complemented by the maximum amplitude error (MAX). The tests have been done for the 12 principal cardiac leads, and the amount of compression is evaluated by means of the mean number of bits per sample (MBPS) and the compression ratio (CR). The implementation cost of both the filter bank and the WP technique has also been studied. The results show that the N-PR cosine-modulated filter bank method outperforms the WP technique in both quality and efficiency.
Affiliation(s)
- Manuel Blanco-Velasco
- Dep. Teoría de la Señal y Comunicaciones, Escuela Politécnica, Universidad de Alcalá, Alcalá de Henares, Madrid, Spain.
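Since the comparison in this entry is reported in terms of MAX, MBPS, and CR alongside the PRD, a small helper sketch of those three figures is given below; the bit budget in the usage example is hypothetical.

```python
# The usual efficiency/quality figures reported alongside the PRD
# (the bit budget in the example is hypothetical).
import numpy as np

def max_amplitude_error(x, x_rec):
    return float(np.max(np.abs(np.asarray(x) - np.asarray(x_rec))))

def compression_ratio(original_bits, compressed_bits):
    return original_bits / compressed_bits

def mean_bits_per_sample(compressed_bits, n_samples):
    return compressed_bits / n_samples

n_samples = 10 * 500                    # 10 s of one lead at 500 Hz
original_bits = n_samples * 12          # 12-bit samples
compressed_bits = 6000                  # assumed coder output size
print(compression_ratio(original_bits, compressed_bits),
      mean_bits_per_sample(compressed_bits, n_samples))
```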
109
Norris JA, Englehart KB, Lovely DF. Myoelectric signal compression using zero-trees of wavelet coefficients. Med Eng Phys 2003; 25:739-46. [PMID: 14519346] [DOI: 10.1016/s1350-4533(03)00118-8]
Abstract
Recent progress in the diagnostic use of the myoelectric signal for neuromuscular diseases, coupled with increasing interest in telemedicine applications, mandates the need for an effective compression technique. The efficacy of the embedded zero-tree wavelet (EZW) compression algorithm is examined with respect to some important analysis parameters (the length of the analysis segment and the wavelet type) and measurement conditions (muscle type and contraction type). It is shown that compression performance improves with segment length, and that good choices of wavelet type include the Meyer wavelet and the fifth-order biorthogonal wavelet. The effects of different muscle sites and contraction types on compression performance are less conclusive. A comparison of a number of lossy compression techniques reveals that the EZW algorithm exhibits superior performance to a hard-thresholding wavelet approach, but falls short of adaptive differential pulse code modulation. The bit prioritization capability of the EZW algorithm allows one to specify the compression factor online, making it an appealing technique for streaming-data applications, as often encountered in telemedicine.
Affiliation(s)
- Jason A Norris
- Institute of Biomedical Engineering, University of New Brunswick, 25 Dineen Drive, NB, E3B 5A3, Fredericton, Canada
110
Alshamali A, Al-Fahoum AS. Comments on "an efficient coding algorithm for the compression of ECG signals using the wavelet transform". IEEE Trans Biomed Eng 2003; 50:1034-7. [PMID: 12892331] [DOI: 10.1109/tbme.2003.814531]
Abstract
In the commented paper, Rajoub (2002) proposed an effective wavelet-based ECG compression algorithm. The reported extraordinary performance motivated us to explore the findings and to use the method in our research activity. During implementation of the proposed algorithm, several important points regarding accuracy, methodology, and coding were found to be improperly substantiated. This paper discusses these findings and provides specific subjective and objective measures that could improve the interpretation of compression results in this type of research problem.
Affiliation(s)
- Ahmad Alshamali
- Hijjawi Faculty for Engineering Technology, Yarmouk University, Irbid 21163, Jordan.
111
Xiong Z, Wu X, Cheng S, Hua J. Lossy-to-lossless compression of medical volumetric data using three-dimensional integer wavelet transforms. IEEE Trans Med Imaging 2003; 22:459-470. [PMID: 12760561] [DOI: 10.1109/tmi.2003.809585]
Abstract
We study lossy-to-lossless compression of medical volumetric data using three-dimensional (3-D) integer wavelet transforms. To achieve good lossy coding performance, it is important to have transforms that are unitary. In addition to the lifting approach, we first introduce a general 3-D integer wavelet packet transform structure that allows implicit bit shifting of wavelet coefficients to approximate a 3-D unitary transformation. We then focus on context modeling for efficient arithmetic coding of wavelet coefficients. Two state-of-the-art 3-D wavelet video coding techniques, namely, 3-D set partitioning in hierarchical trees (Kim et al., 2000) and 3-D embedded subband coding with optimal truncation (Xu et al., 2001), are modified and applied to compression of medical volumetric data, achieving the best performance published so far in the literature, in terms of both lossy and lossless compression.
Affiliation(s)
- Zixiang Xiong
- Department of Electrical Engineering, Texas A&M University, College Station, TX 77843, USA.
112
Hwang WJ, Chine CF, Li KJ. Scalable medical data compression and transmission using wavelet transform for telemedicine applications. IEEE Trans Inf Technol Biomed 2003; 7:54-63. [PMID: 12670019] [DOI: 10.1109/titb.2003.808499]
Abstract
In this paper, a novel medical data compression algorithm, termed layered set partitioning in hierarchical trees (LSPIHT) algorithm, is presented for telemedicine applications. In the LSPIHT, the encoded bit streams are divided into a number of layers for transmission and reconstruction. Starting from the base layer, by accumulating bit streams up to different enhancement layers, we can reconstruct medical data with various signal-to-noise ratios (SNRs) and/or resolutions. Receivers with distinct specifications can then share the same source encoder to reduce the complexity of telecommunication networks for telemedicine applications. Numerical results show that, besides having low network complexity, the LSPIHT attains better rate-distortion performance as compared with other algorithms for encoding medical data.
Affiliation(s)
- Wen-Jyi Hwang
- Department of Electrical Engineering, Chung Yuan Christian University, Chungli, 32023, Taiwan.
113
Dutt DN, Krishnan SM, Srinivasan N. A dynamic nonlinear time domain model for reconstruction and compression of cardiovascular signals with application to telemedicine. Comput Biol Med 2003; 33:45-63. [PMID: 12485629] [DOI: 10.1016/s0010-4825(02)00058-6]
Abstract
A new nonlinear time domain model is proposed in this paper for signals of cardiovascular origin. An equation of the dynamic nonlinear model has been obtained by considering a masking function, which is modulated by a harmonic series with the baseline drift incorporated into the model. Signal reconstruction using model parameters has established the effectiveness of the model for signal compression. Improvement has been effected by using neural networks for reducing the time for optimizing the initial parameters. An improved adaptive optimization step size algorithm has also been implemented. Results show that the technique is able to provide reasonable compression with low error between the original and reconstructed signals. One of the main advantages of the model is its potential of being used for compression of many different types of biosignals transmitted in parallel. Incorporation of the compression model into a telemedicine system has led to considerable saving in transmission time for patient data.
Affiliation(s)
- D Narayana Dutt
- Biomedical Engineering Research Centre, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 639798, Singapore
114
Miaou SG, Yen HL, Lin CL. Wavelet-based ECG compression using dynamic vector quantization with tree codevectors in single codebook. IEEE Trans Biomed Eng 2002; 49:671-80. [PMID: 12083301] [DOI: 10.1109/tbme.2002.1010850]
Abstract
In this paper, we propose a novel vector quantizer (VQ) in the wavelet domain for the compression of electrocardiogram (ECG) signals. A vector called a tree vector (TV) is first formed in a novel structure, where wavelet-transformed (WT) coefficients in the vector are arranged in the order of a hierarchical tree. Then, the TVs extracted from the various WT subbands are collected in one single codebook. This feature is an advantage over traditional WT-VQ methods, where multiple codebooks are needed and are usually designed separately because the numerical ranges of coefficient values in the various WT subbands are quite different. Finally, a distortion-constrained codebook replenishment mechanism is incorporated into the VQ, where codevectors can be updated dynamically, to guarantee reliable quality of the reconstructed ECG waveforms. With the proposed approach, both the visual quality and the objective quality in terms of the percent root-mean-square difference (PRD) are excellent even at a very low bit rate. For the entire 48 records of Lead II ECG data in the MIT/BIH database, an average PRD of 7.3% at 146 b/s is obtained. For the same test data, the proposed method outperforms many recently published ones, including the best one, known as set partitioning in hierarchical trees.
Affiliation(s)
- Shaou-Gang Miaou
- Department of Electronic Engineering, Chung Yuan Christian University, Chung-Li, Taiwan, ROC.
115
Abo-Zahhad M, Rajoub BA. An effective coding technique for the compression of one-dimensional signals using wavelet transforms. Med Eng Phys 2002; 24:185-99. [PMID: 12062177] [DOI: 10.1016/s1350-4533(02)00004-8]
Abstract
This paper introduces an effective technique for the compression of one-dimensional signals using wavelet transforms. It is based on generating a binary stream of 1s and 0s that encodes the structure of the wavelet coefficients (i.e., the locations of zero and nonzero coefficients). A new coding algorithm, similar to run-length encoding, has been developed for the compression of this binary stream. The compression performance of the technique is measured using the compression ratio (CR) and the percent root-mean-square difference (PRD). To assess the technique properly, we have evaluated the effect of signal length, threshold level selection, and wavelet filters on the quality of the reconstructed signal. The effect of finite word-length representation on the compression ratio and PRD is also discussed. The technique is tested on the compression of normal and abnormal electrocardiogram (ECG) signals. The performance parameters of the proposed coding algorithm are measured, and compression ratios of 19:1 and 45:1 are achieved with PRDs of 1% and 2.8%, respectively. At the receiver end, the received signal is decoded and inverse transformed before being processed. Finally, the merits and demerits of the technique are discussed.
Affiliation(s)
- Mohammed Abo-Zahhad
- Electronics Engineering Department, Hijjawi Faculty for Engineering Technology, Yarmouk University, Irbid, Jordan.
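A minimal sketch of the significance-map idea: threshold the wavelet coefficients, emit a 0/1 map of zero and nonzero locations, and run-length code that stream. The wavelet, level, and threshold are placeholder values, and the run coding shown is plain run-length encoding rather than the paper's specific algorithm.

```python
# Threshold DWT coefficients, build a binary significance map, run-length code it
# (placeholder wavelet, level, and threshold).
import numpy as np
import pywt

def encode_significance(signal, wavelet="db4", level=5, threshold=0.05):
    """Return (significance map, kept coefficient values, coefficient slices)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    keep = np.abs(flat) >= threshold
    return keep.astype(np.uint8), flat[keep], slices

def run_length(bits):
    """Run-length code of a 0/1 stream: (starting bit, run lengths)."""
    bits = np.asarray(bits, dtype=np.int64)
    change = np.flatnonzero(np.diff(bits)) + 1
    runs = np.diff(np.concatenate(([0], change, [bits.size])))
    return int(bits[0]), runs

rng = np.random.default_rng(5)
ecg = np.cumsum(rng.standard_normal(2048)) * 0.01     # toy smooth signal
sig_map, values, slices = encode_significance(ecg)
first_bit, runs = run_length(sig_map)
print(sig_map.size, values.size, runs[:10])
```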
116
Rajoub BA. An efficient coding algorithm for the compression of ECG signals using the wavelet transform. IEEE Trans Biomed Eng 2002; 49:355-62. [PMID: 11942727] [DOI: 10.1109/10.991163]
Abstract
A wavelet-based electrocardiogram (ECG) data compression algorithm is proposed in this paper. The ECG signal is first preprocessed, and the discrete wavelet transform (DWT) is then applied to the preprocessed signal. Preprocessing guarantees that the magnitudes of the wavelet coefficients are less than one and reduces the reconstruction errors near both ends of the compressed signal. The DWT coefficients are divided into three groups, and each group is thresholded using a threshold based on a desired energy packing efficiency. A binary significance map is then generated by scanning the wavelet decomposition coefficients and outputting a binary one if the scanned coefficient is significant and a binary zero if it is insignificant. Compression is achieved by 1) using a variable-length code based on run-length encoding to compress the significance map and 2) using direct binary representation for the significant coefficients. The ability of the coding algorithm to compress ECG signals is investigated by compressing and decompressing the test signals. The proposed algorithm is compared with direct-based and wavelet-based compression algorithms and shows superior performance. A compression ratio of 24:1 was achieved for MIT-BIH record 117 with a percent root-mean-square difference as low as 1.08%.
Affiliation(s)
- Bashar A Rajoub
- Department of Electrical and Communications Engineering, Yarmouk University, Irbid, Jordan.
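The energy-packing-efficiency thresholding step can be sketched as follows; the grouping of coefficients, the target EPE values, and the decision to keep the approximation band intact are assumptions for illustration, and the preprocessing, significance-map coding, and bit packing of the paper are omitted.

```python
# Energy-packing-efficiency (EPE) thresholding of DWT coefficient groups
# (illustrative grouping and target EPE values).
import numpy as np
import pywt

def epe_threshold(coeff_group, target_epe=0.99):
    """Keep the smallest set of largest-magnitude coefficients whose energy
    reaches the target fraction of the group energy; zero the rest."""
    c = np.asarray(coeff_group, dtype=float)
    order = np.argsort(np.abs(c))[::-1]                  # largest magnitude first
    energy = np.cumsum(c[order] ** 2) / (np.sum(c ** 2) + 1e-12)
    k = int(np.searchsorted(energy, target_epe)) + 1
    out = np.zeros_like(c)
    out[order[:k]] = c[order[:k]]
    return out

rng = np.random.default_rng(6)
ecg = np.cumsum(rng.standard_normal(2048)) * 0.01
coeffs = pywt.wavedec(ecg, "db4", level=5)
# Assumed grouping: approximation band kept intact, detail bands thresholded.
coeffs = [coeffs[0]] + [epe_threshold(d, 0.97) for d in coeffs[1:]]
rec = pywt.waverec(coeffs, "db4")
```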
117
Miaou SG, Lin CL. A quality-on-demand algorithm for wavelet-based compression of electrocardiogram signals. IEEE Trans Biomed Eng 2002; 49:233-9. [PMID: 11876287] [DOI: 10.1109/10.983457]
Abstract
For the compression of medical signals such as electrocardiogram (ECG), excellent reconstruction quality of a highly compressed signal can be obtained by using a wavelet-based approach. The most widely used objective quality criterion for the compressed ECG is called the percent of root-mean-square difference (PRD). In this paper, given a user-specified PRD, an algorithm is proposed to meet the PRD demand by searching for an appropriate bit rate in an automatic, smooth, and fast manner for the wavelet-based compression. The bit rate searching is modeled as a root-finding problem for a one-dimensional function, where an unknown rate-distortion curve represents the function and the desired rate is the root to be sought. A solution derived from root-finding methods in numerical analysis is proposed. The proposed solution is incorporated in a well-known wavelet-based coding strategy called set partitioning in hierarchical trees. ECG signals taken from the MIT/BIH database are tested, and excellent results in terms of convergence speed, quality variation, and coding performance are obtained.
Affiliation(s)
- Shaou-Gang Miaou
- Department of Electronic Engineering, Chung Yuan Christian University, Chung-Li, Taiwan, ROC.
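The root-finding idea can be illustrated with a plain bisection on a monotone rate-quality knob; in the sketch below a simple keep-the-largest-coefficients truncation stands in for the SPIHT coder, and the tolerance, bracket, and wavelet settings are arbitrary.

```python
# Bisection on a "rate" knob until a user-specified PRD is met (a coefficient
# truncation stands in for SPIHT; settings are arbitrary).
import numpy as np
import pywt

def prd(x, x_rec):
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def code_at_fraction(ecg, frac, wavelet="db4", level=5):
    """Toy rate-controlled coder: keep a fraction of the largest DWT coefficients."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    k = max(1, int(frac * flat.size))
    thresh = np.sort(np.abs(flat))[-k]
    kept = np.where(np.abs(flat) >= thresh, flat, 0.0)
    rec = pywt.waverec(pywt.array_to_coeffs(kept, slices, output_format="wavedec"),
                       wavelet)
    return rec[:ecg.size]

def rate_for_target_prd(ecg, target_prd, tol=0.05, max_iter=30):
    lo, hi = 0.001, 1.0                       # keeping more coefficients lowers PRD
    mid = hi
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        err = prd(ecg, code_at_fraction(ecg, mid))
        if abs(err - target_prd) < tol:
            break
        if err > target_prd:
            lo = mid                          # need more coefficients
        else:
            hi = mid                          # can afford fewer
    return mid

rng = np.random.default_rng(7)
ecg = np.cumsum(rng.standard_normal(4096)) * 0.01
print(rate_for_target_prd(ecg, target_prd=5.0))
```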
118
Wei JJ, Chang CJ, Chou NK, Jan GJ. ECG data compression using truncated singular value decomposition. IEEE Trans Inf Technol Biomed 2001; 5:290-9. [PMID: 11759835] [DOI: 10.1109/4233.966104]
Abstract
The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated in a small number of singular values and their related singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant remainder can be truncated to eliminate redundancy. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. An average data rate of 143.2 b/s with a relatively low reconstruction error was achieved. These results show that the truncated SVD method can provide efficient coding with high compression ratios. Its computational efficiency, compared with other techniques, demonstrates that it is an effective technique for ECG data storage or signal transmission. Index Terms: data compression, electrocardiogram, feature extraction, quasi-periodic signal, singular value decomposition.
Affiliation(s)
- J J Wei
- Department of Electrical Engineering, National Taiwan University, Taipei 10617, Taiwan, ROC.
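A compact sketch of the truncated-SVD idea on a matrix of stacked beats; the fixed-length windowing and the choice of k singular triplets are illustrative simplifications of the paper's beat segmentation and rate control.

```python
# Truncated SVD of a matrix of stacked beats: keep only the k largest singular
# triplets (fixed-length windowing and k are illustrative choices).
import numpy as np

def truncated_svd_compress(beat_matrix, k):
    U, s, Vt = np.linalg.svd(beat_matrix, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def truncated_svd_reconstruct(U, s, Vt):
    return (U * s) @ Vt

rng = np.random.default_rng(8)
template = np.sin(np.linspace(0, 2 * np.pi, 256)) ** 3       # toy beat shape
beats = template + 0.02 * rng.standard_normal((40, 256))     # 40 correlated beats
U, s, Vt = truncated_svd_compress(beats, k=3)
approx = truncated_svd_reconstruct(U, s, Vt)
stored_values = U.size + s.size + Vt.size                    # retained data
print(stored_values, beats.size, np.sqrt(np.mean((beats - approx) ** 2)))
```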
119
Batista LV, Melcher EU, Carvalho LC. Compression of ECG signals by optimized quantization of discrete cosine transform coefficients. Med Eng Phys 2001; 23:127-34. [PMID: 11413065] [DOI: 10.1016/s1350-4533(01)00030-3]
Abstract
This paper presents an ECG compressor based on optimized quantization of discrete cosine transform (DCT) coefficients. The ECG to be compressed is partitioned into blocks of fixed size, and each DCT block is quantized using a quantization vector and a threshold vector that are specifically defined for each signal. These vectors are defined, via Lagrange multipliers, so that the estimated entropy is minimized for a given distortion in the reconstructed signal. The optimization method presented in this paper is an adaptation for ECG of a technique previously used for image compression. In the last step of the proposed compressor, the quantized coefficients are coded by an arithmetic coder. The percent root-mean-square difference (PRD) was adopted as a measure of the distortion introduced by the compressor. To assess the performance of the proposed compressor, 2-minute sections of all 96 records of the MIT-BIH Arrhythmia Database were compressed at different PRD values, and the corresponding compression ratios were computed. We also present traces of test signals before and after the compression/decompression process. The results show that the proposed method achieves good compression ratios (CR) with excellent reconstruction quality. An average CR of 9.3:1 is achieved for a PRD of 2.5%. Experiments with ECG records used in other results from the literature reveal that the proposed method compares favorably with various classical and state-of-the-art ECG compressors.
Affiliation(s)
- L V Batista
- COPELE, Federal University of Paraiba, Av. Aprigio Veloso, 882-Bodocongo, 58.109-970, Campina Grande, PB, Brazil.
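A bare-bones sketch of blockwise DCT coding with a threshold and a uniform quantization step; the per-signal quantization and threshold vectors optimized via Lagrange multipliers, and the arithmetic coder, are the substance of the paper and are not reproduced here, so the numbers below are placeholders.

```python
# Blockwise DCT with a threshold and a uniform quantization step (placeholder
# block length, step, and threshold; no optimization or entropy coding).
import numpy as np
from scipy.fft import dct, idct

def dct_block_quantize(block, qstep, threshold):
    """DCT a block, zero sub-threshold coefficients, quantize the rest."""
    c = dct(block, norm="ortho")
    c[np.abs(c) < threshold] = 0.0
    return np.round(c / qstep).astype(np.int32)

def dct_block_dequantize(qc, qstep):
    return idct(qc * qstep, norm="ortho")

rng = np.random.default_rng(9)
ecg = np.cumsum(rng.standard_normal(4096)) * 0.01
block_len, qstep, threshold = 256, 0.02, 0.01
blocks = ecg.reshape(-1, block_len)
quantized = np.stack([dct_block_quantize(b, qstep, threshold) for b in blocks])
rec = np.concatenate([dct_block_dequantize(q, qstep) for q in quantized])
print(100.0 * np.sqrt(np.sum((ecg - rec) ** 2) / np.sum(ecg ** 2)))   # resulting PRD
```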