Recursive least squares approach to combining principal and minor component analyses

Electronics Letters

A novel approach to high-performance data compression using neural networks is proposed. After the principal components of the input vectors are extracted, the error covariance matrix obtained in the recursive least squares training process is used to prune minor components, so that a higher compression ratio is achieved. Simulation results show that the method effectively combines principal and minor component analyses.
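The abstract's idea can be illustrated with a minimal sketch. This is not the authors' recursive least squares algorithm: as an assumption, the pruning criterion here is a simple explained-variance threshold on the eigenvalues of the sample covariance matrix, standing in for the paper's error-covariance-based criterion. All variable names (`X`, `W`, `keep`) are illustrative.

```python
import numpy as np

# Illustrative sketch: compress data by keeping principal components and
# pruning minor components whose variance contribution is negligible.
rng = np.random.default_rng(0)
# Synthetic inputs: 4 strong directions, 4 weak (minor) directions.
X = rng.normal(size=(500, 8)) @ np.diag([5, 4, 3, 2, 0.1, 0.05, 0.02, 0.01])

C = np.cov(X, rowvar=False)            # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]      # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Prune minor components: keep directions explaining > 1% of total variance.
keep = eigvals / eigvals.sum() > 0.01
W = eigvecs[:, keep]                   # compression matrix

Z = X @ W                              # compressed representation
X_hat = Z @ W.T                        # reconstruction
ratio = X.shape[1] / W.shape[1]        # compression ratio
err = np.mean((X - X_hat) ** 2)        # reconstruction error
print(int(keep.sum()), ratio)          # components kept, compression ratio
```

Pruning the four near-zero-variance directions here doubles the compression ratio while the mean squared reconstruction error stays on the order of the discarded variance.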

Inspec keywords: image reconstruction; neural nets; data compression; least squares approximations; covariance matrices; image coding

Other keywords: minor component analyses; principal component analyses; neural networks; recursive least squares approach; data compression; training process; compression ratio; error covariance matrix; minor components pruning; input vectors

Subjects: Neural nets (theory); Interpolation and function approximation (numerical analysis); Information theory; Codes

http://iet.metastore.ingenta.com/content/journals/10.1049/el_19980765