The accuracy of microwave radiometers is limited by deviations of the detector from ideal square-law behaviour. A method is presented for determining the error caused by higher-order nonlinearities, based on experimental characterisation of the nonlinearities with a two-tone test. In actual operation, with Gaussian noise as the input signal, the error caused by higher-order nonlinearities is found to be greater than that with a sinusoidal input signal.
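The effect described in the abstract can be illustrated with a minimal numeric sketch. The detector model below is hypothetical (an ideal square law plus a small fourth-order term; the coefficients are illustrative, not from the letter): because the fourth moment of Gaussian noise (3σ⁴) exceeds that of a sinusoid of equal power (1.5σ⁴), the nonlinearity error is roughly twice as large with a noise input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical detector: ideal square law plus a small 4th-order term.
# Coefficients are illustrative only, not taken from the paper.
a2, a4 = 1.0, 0.05

def detector(x):
    return a2 * x**2 + a4 * x**4

def excess_error(x):
    # Deviation of the mean detector output from an ideal square-law
    # response to the same waveform (i.e. the higher-order error).
    return detector(x).mean() - a2 * (x**2).mean()

sigma2 = 0.1          # input power (variance)
n = 1_000_000

# Sinusoid with average power sigma2 = A^2 / 2 (1000 full cycles sampled).
A = np.sqrt(2 * sigma2)
t = np.linspace(0.0, 1.0, n, endpoint=False)
sine = A * np.sin(2 * np.pi * 1000 * t)

# Gaussian noise with the same power.
noise = rng.normal(0.0, np.sqrt(sigma2), n)

err_sine = excess_error(sine)    # a4 * 3A^4/8  = 1.5 * a4 * sigma2^2
err_noise = excess_error(noise)  # a4 * 3*sigma2^2 (Gaussian 4th moment)

print(err_noise / err_sine)      # close to 2
```

The ratio of the two errors follows directly from the fourth moments and does not depend on the size of the fourth-order coefficient, so the same factor appears for any weakly non-square-law detector of this form.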
References
1) Middleton, D.: 'Some general results in the theory of noise through non-linear devices', Quart. Appl. Math., pp. 445–498
2) Davis, P.J.: 'Interpolation and approximation' (1975)
3) Reinhardt, V.S., Shih, Y.C., Toth, P.A., Reynolds, S.C., Berman, A.L.: 'Methods for measuring the power linearity of microwave detectors for radiometric applications', IEEE Trans., pp. 715–720
4) Trier, M.: 'Radiometer receiver linearity characterization by intermodulation distortion method', IEEE IMTC Conf. Record, 1993, pp. 99–102
5) Reinhardt, V.S., Shih, Y.C., Toth, P.A., Reynolds, S.C., Berman, A.L.: 'Methods for measuring the power linearity of microwave detectors for radiometric applications', IEEE MTT-S Symp. Dig., 1994, pp. 1477–1480