Bearing failures are a major source of problems in rotating machines. These faults appear
as impulses at periodic intervals, giving rise to specific characteristic frequencies. However,
the characteristic frequencies are submerged in noise caused by small imperfections in
the balance or smoothness of the bearing components. To retrieve the characteristic fault
frequencies from the vibration signal, signal denoising is an essential processing step in
bearing fault diagnosis. This paper presents a time-frequency analysis and nonlinear manifold
learning technique for denoising vibration signals corrupted by additive white Gaussian noise.
To keep the computing time acceptable, a novel manifold learning denoising method
is put forward that combines data compression and reconstruction operations. Simulations and
experiments are employed to verify the feasibility and effectiveness of the proposed method on
bearing vibration signals. Furthermore, this method can be applied in other fault detection fields,
such as engines, suspension devices, and vehicle structures.
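To make the compression-and-reconstruction idea concrete, the sketch below simulates periodic bearing-fault impulses buried in additive white Gaussian noise, compresses the signal through a low-rank trajectory-matrix (Hankel/SVD) decomposition, and reconstructs a denoised waveform by diagonal averaging. This is an illustrative stand-in, not the paper's exact nonlinear manifold learning algorithm; the sampling rate, fault frequency, noise level, embedding length L, and retained rank r are all assumed example values.

```python
# Minimal illustrative sketch (assumed parameters, not the paper's algorithm):
# simulate a noisy bearing-fault signal, compress it to a low-dimensional
# subspace, and reconstruct a denoised version.
import numpy as np

fs = 12_000                      # sampling rate [Hz] (assumed)
t = np.arange(0, 1.0, 1 / fs)
fault_freq = 105.0               # hypothetical characteristic fault frequency [Hz]

# Fault impulses: decaying resonances repeated at the fault period,
# buried in additive white Gaussian noise.
impulse_times = np.arange(0, t[-1], 1 / fault_freq)
clean = np.zeros_like(t)
for ti in impulse_times:
    m = t >= ti
    clean[m] += np.exp(-800 * (t[m] - ti)) * np.sin(2 * np.pi * 3000 * (t[m] - ti))
noisy = clean + 0.5 * np.random.randn(t.size)

# Compression: embed the signal in a trajectory (Hankel) matrix and keep only
# the leading singular components, which capture the periodic impulsive part.
L = 200                                      # embedding window length (assumed)
K = noisy.size - L + 1
hankel = np.lib.stride_tricks.sliding_window_view(noisy, L).T   # L x K
U, s, Vt = np.linalg.svd(hankel, full_matrices=False)
r = 10                                       # retained components (assumed)
low_rank = (U[:, :r] * s[:r]) @ Vt[:r]

# Reconstruction: average over anti-diagonals to map the low-rank matrix
# back to a one-dimensional denoised signal.
denoised = np.zeros(noisy.size)
counts = np.zeros(noisy.size)
for i in range(L):
    denoised[i:i + K] += low_rank[i]
    counts[i:i + K] += 1
denoised /= counts

print("input SNR  [dB]:", 10 * np.log10(np.sum(clean**2) / np.sum((noisy - clean)**2)))
print("output SNR [dB]:", 10 * np.log10(np.sum(clean**2) / np.sum((denoised - clean)**2)))
```

In this framing, the truncated decomposition plays the role of the data compression step and the diagonal averaging plays the role of the reconstruction step; the paper's contribution lies in performing the compression with a nonlinear manifold learning model rather than a linear low-rank one.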
Keywords: bearing,