Adaptive model selection for polynomial NARX models

Two algorithms are proposed for the adaptive model selection of polynomial non-linear autoregressive with exogenous variables (NARX) models. The recursive forward regression with pruning (RFRP) algorithm is based on a recursive orthogonal least-squares (ROLS) procedure and efficiently integrates model augmentation and pruning to reduce the processing time required whenever new data become available. It provides excellent model structure tracking compared with other OLS-based model selection policies. The ROLS-LASSO algorithm is less accurate but much faster, making it suitable for time-critical applications. It uses a recursive version of the least absolute shrinkage and selection operator (LASSO) regularisation approach for structure selection, featuring a recursive standardisation of the regressors, and performs parameter estimation with ROLS. A sliding-window data update is adopted for both algorithms, although the methods generalise seamlessly to exponential windowing with a forgetting factor. Simulation examples demonstrate the model-tracking capabilities of the algorithms.
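
To make the setting concrete, the sketch below shows sliding-window structure selection for a polynomial NARX model in a minimal, non-recursive form. It is not the RFRP or ROLS-LASSO algorithm described above: each window is re-processed from scratch with a batch forward OLS selection driven by the error reduction ratio, rather than being updated recursively as samples enter and leave the window. The model orders, selection threshold, window length and the simulated system are illustrative assumptions.

```python
# Minimal sliding-window polynomial NARX structure selection (batch baseline,
# not the paper's recursive RFRP/ROLS-LASSO algorithms).
import itertools
import numpy as np


def narx_regressors(u, y, nu=2, ny=2, degree=2):
    """Build polynomial NARX terms from lagged inputs and outputs.

    Row t of Phi contains all monomials up to `degree` in
    {y(t-1)..y(t-ny), u(t-1)..u(t-nu)}, plus a constant term.
    """
    lag = max(nu, ny)
    base_names = [f"y(t-{i})" for i in range(1, ny + 1)] + \
                 [f"u(t-{i})" for i in range(1, nu + 1)]
    rows, names = [], None
    for t in range(lag, len(y)):
        base = [y[t - i] for i in range(1, ny + 1)] + \
               [u[t - i] for i in range(1, nu + 1)]
        row, names = [1.0], ["1"]
        for d in range(1, degree + 1):
            for combo in itertools.combinations_with_replacement(range(len(base)), d):
                row.append(float(np.prod([base[j] for j in combo])))
                names.append("*".join(base_names[j] for j in combo))
        rows.append(row)
    return np.array(rows), np.asarray(y[lag:], dtype=float), names


def forward_ols_select(Phi, target, err_tol=1e-3, max_terms=10):
    """Greedy forward term selection by the error reduction ratio (classical OLS idea)."""
    selected, W = [], []          # chosen column indices and their orthogonalised versions
    yy = target @ target
    for _ in range(min(max_terms, Phi.shape[1])):
        best_err, best_j, best_w = 0.0, None, None
        for j in range(Phi.shape[1]):
            if j in selected:
                continue
            w = Phi[:, j].astype(float)
            for wk in W:          # orthogonalise candidate against already selected terms
                w = w - (wk @ w) / (wk @ wk) * wk
            ww = w @ w
            if ww < 1e-12:
                continue
            err = (w @ target) ** 2 / (ww * yy)   # error reduction ratio
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        if best_j is None or best_err < err_tol:
            break
        selected.append(best_j)
        W.append(best_w)
    theta, *_ = np.linalg.lstsq(Phi[:, selected], target, rcond=None)
    return selected, theta


# Data simulated from an assumed second-order polynomial NARX system.
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, 600)
y = np.zeros(600)
for t in range(2, 600):
    y[t] = (0.5 * y[t - 1] + 0.3 * u[t - 1] + 0.2 * u[t - 1] * y[t - 2]
            + 0.01 * rng.standard_normal())

Phi, target, names = narx_regressors(u, y)

# Sliding window: re-select the model structure every `step` samples.
window, step = 200, 100
for start in range(0, len(target) - window + 1, step):
    sl = slice(start, start + window)
    terms, theta = forward_ols_select(Phi[sl], target[sl])
    model = " ".join(f"{theta[i]:+.3f}*{names[j]}" for i, j in enumerate(terms))
    print(f"window [{start}, {start + window}): y(t) = {model}")
```

The cost of this baseline grows with every window because the orthogonalisation is redone from scratch; the algorithms in the paper instead update the ROLS decomposition as data are added and discarded, which is what reduces the processing time whenever new data become available.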
