Document Type : Research Paper

Authors

Department of Engineering Sciences, University of Tehran

10.22059/jac.2024.370994.1209

Abstract

Noise is inherent in data, whether the data come from measurement or experiment. In recent years, several techniques have been proposed for fault detection and noise reduction to improve data quality, some of which are based on wavelets, orthogonalization, and neural networks. However, the computational cost of existing methods is often higher than expected, which makes their application impractical in some cases. In this paper, we propose a tridiagonal model that describes the noise at each point as a function of the surrounding signal elements. To make the predicted noise more reliable, the algorithm is equipped with a learning/feedback mechanism. Our algorithm handles both small and large noise values. Although the presented numerical results suggest superlinear convergence of the proposed algorithm, we are only able to prove linear convergence. The numerical results confirm that, in most cases, the proposed algorithm is more efficient than the orthogonalization-based method introduced in 2015.

Keywords