Performance of a mixed Lagrange time delay estimation autoregressive (MLTDEAR) model for single-image signal-to-noise ratio estimation in scanning electron microscopy

Citation

SIM, K. S. and CHUAH, H. T. and ZHENG, C. (2005) Performance of a mixed Lagrange time delay estimation autoregressive (MLTDEAR) model for single-image signal-to-noise ratio estimation in scanning electron microscopy. Journal of Microscopy, 219 (1). pp. 1-17. ISSN 0022-2720

Full text not available from this repository.

Abstract

A novel technique based on the statistical autoregressive (AR) model was recently developed to estimate the signal-to-noise ratio (SNR) in scanning electron microscope (SEM) images. In a subsequent study, the authors developed an algorithm that cascades the AR model with the Lagrange time delay (LTD) estimator; this technique is named the mixed Lagrange time delay estimation autoregressive (MLTDEAR) model. In this paper, the fundamental performance limits for the problem of single-image SNR estimation, as derived from the Cramer-Rao inequality, are presented. The experimental performance of several existing methods - the simple method, the first-order linear interpolator, the AR-based estimator and the MLTDEAR method - is compared against this bound. In several test cases involving different images, the efficiency of the MLTDEAR single-image estimation technique proved significantly better than that of the other three methods. The effect of different SEM operating conditions on the autocorrelation function curve is also discussed.
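For orientation, the "first-order linear interpolator" baseline mentioned in the abstract estimates SNR from the image autocorrelation function (ACF): white noise contributes only to the zero-lag ACF value, so extrapolating the ACF from small nonzero lags back to lag zero separates the signal variance from the noise variance. The sketch below (Python) illustrates this idea under stated assumptions; the function names, the row-averaged ACF, and the two-lag extrapolation scheme are illustrative choices, not the paper's implementation.

```python
import numpy as np

def acf_1d(x, max_lag):
    """Biased sample autocorrelation of a 1-D signal at lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def snr_linear_interpolator(image):
    """Single-image SNR estimate via first-order linear extrapolation
    of the row-averaged ACF to zero lag (illustrative sketch).

    Assumes additive white noise, so the noise variance appears only
    at lag 0 while lags 1 and 2 reflect the correlated signal alone.
    """
    img = np.asarray(image, dtype=float)
    # Average the per-row ACF over all rows to reduce estimation variance.
    acf = np.mean([acf_1d(row, 2) for row in img], axis=0)
    r0_measured = acf[0]                   # signal variance + noise variance
    # Linear extrapolation through (1, r1) and (2, r2) back to lag 0:
    # r(0) ~ 2*r(1) - r(2), an estimate of the noise-free peak.
    r0_noise_free = 2.0 * acf[1] - acf[2]  # signal variance estimate
    noise_var = r0_measured - r0_noise_free
    return r0_noise_free / noise_var       # SNR = signal var / noise var
```

In use, one would call snr_linear_interpolator(sem_image) on a 2-D image array; if the extrapolated noise variance comes out non-positive, the image is effectively noise-free at this lag resolution and the estimate is unreliable. The AR and MLTDEAR estimators studied in the paper replace the two-point linear extrapolation with model-based fits of the ACF.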

Item Type: Article
Subjects: Q Science > Q Science (General)
Divisions: Faculty of Engineering and Technology (FET)
Depositing User: Ms Rosnani Abd Wahab
Date Deposited: 12 Aug 2011 01:25
Last Modified: 12 Aug 2011 01:25
URI: http://shdl.mmu.edu.my/id/eprint/2213
