A New Framework for Efficient Low-light Image Enhancement using Approximated Gaussian Process

Citation

Loh, Yuen Peng (2020) A New Framework for Efficient Low-light Image Enhancement using Approximated Gaussian Process. International Journal of Engineering Trends and Technology. pp. 36-43. ISSN 2231-5381


Abstract

Gaussian Process (GP) is a robust distribution modeling technique that is very promising for computer vision systems. In particular, its multivariate distribution modeling is especially effective for low-light image enhancement, where localized enhancement is required to address the over- and under-enhancement problem, as well as the retrieval of features that have been lost due to low illumination. However, GP lacks practicality due to its computational complexity, which increases cubically with the amount of data. This paper proposes a sparse GP regression based solution in which clustering is exploited to reduce the training cost of a GP model. Instead of utilizing all values from an image, clustering groups similar training pixel or image-patch pairs into clusters, and the cluster centers are used to train an approximate GP enhancement model. Experiments showed that the proposed framework can reduce training time by as much as 75% from the baseline. In line with this, the proposed approach also improved enhancement performance in both PSNR and feature retrieval metrics, and is competitive with the current state-of-the-art.
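The core idea in the abstract, clustering the training pairs and fitting the GP only on cluster centers, can be sketched as follows. This is a minimal illustration using scikit-learn with synthetic data; the gamma-curve enhancement target, the number of clusters, and the kernel settings are all assumptions for the sketch, not the paper's actual configuration.

```python
# Sketch of clustering-based sparse GP regression: cluster the
# (input, output) training pairs and train the GP on cluster centers
# only, avoiding the O(n^3) cost of a full GP fit.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Synthetic "training pairs": low-light pixel intensities x and their
# enhanced counterparts y (a simple gamma curve stands in for the
# ground-truth enhancement mapping).
x = rng.uniform(0.0, 0.3, size=(5000, 1))   # dark pixel values
y = x.ravel() ** 0.4                        # illustrative target

# Group similar (input, output) pairs into clusters and keep only
# the cluster centers as the reduced training set.
pairs = np.hstack([x, y[:, None]])
km = KMeans(n_clusters=50, n_init=10, random_state=0).fit(pairs)
cx = km.cluster_centers_[:, :1]             # representative inputs
cy = km.cluster_centers_[:, 1]              # representative outputs

# Train the approximate GP enhancement model on the 50 centers only.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-4)
gp.fit(cx, cy)

# Enhance new low-light pixel values with the compact model.
dark = np.array([[0.05], [0.15], [0.25]])
enhanced = gp.predict(dark)
```

With 50 centers in place of 5000 pairs, the cubic-cost GP fit operates on a matrix that is two orders of magnitude smaller, which is the source of the training-time reduction the abstract reports.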

Item Type: Article
Uncontrolled Keywords: Gaussian Process, image enhancement, low-light, sparse approximation.
Subjects: Q Science > QA Mathematics > QA273-280 Probabilities. Mathematical statistics
Divisions: Faculty of Computing and Informatics (FCI)
Depositing User: Ms Rosnani Abd Wahab
Date Deposited: 29 Sep 2021 08:32
Last Modified: 29 Sep 2021 08:32
URI: http://shdl.mmu.edu.my/id/eprint/8428
