Citrus sorting by identification of the most common defects using multispectral computer vision
Bibliographic citation: Blasco, J., Aleixos, N., Gomez, J., Moltó, E. (2007). Citrus sorting by identification of the most common defects using multispectral computer vision. Journal of Food Engineering, 83(3), 384-393.
The presence of skin defects is one of the most influential factors in the price of fruit. The detection of defects during packing operations ensures that only fruit of good quality reaches the market. Moreover, identifying the type of each defect will increase both the quality of the fruit and the producer's profit. At present, fruit with slight defects is either marketed together with sound fruit, depreciating the quality of the batch, or removed together with seriously damaged fruit, causing economic losses. Most current computer vision systems used in the automatic quality inspection of food are limited to the visible region of the electromagnetic spectrum, as they tend to imitate the human eye. However, non-visible information, such as that provided by the near-infrared or ultraviolet regions of the spectrum, can improve inspection by detecting specific defects or revealing damage that is not visible. This work summarises our research on the application of near-infrared, ultraviolet and fluorescence computer vision systems to the identification of the most common defects of citrus fruits, and proposes a fruit sorting algorithm that combines these different sources of spectral information (including visible) to classify fruit according to the type of defect. Results showed that non-visible information can improve the detection and identification of some defects. Compared with the results from colour images, the detection accuracy of anthracnose, 86% with colour images alone, increased when NIR images were used; and the accuracy for green mould increased from 65% to 94% by using fluorescence images. (c) 2007 Elsevier Ltd. All rights reserved.
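The sorting idea described in the abstract, in which each spectral band is best suited to revealing particular defects (fluorescence for green mould, NIR for anthracnose, colour for ordinary skin blemishes), can be sketched as a simple per-band decision rule. The band names, measurement fields, and all thresholds below are illustrative assumptions for the sketch, not the paper's actual algorithm or parameters.

```python
# Hypothetical sketch of combining multispectral cues to label a citrus
# fruit by its most likely defect type. Thresholds are invented for
# illustration only; the paper's real classifier is not reproduced here.
from dataclasses import dataclass


@dataclass
class BandMeasurements:
    visible_defect_area: float    # fraction of skin flagged in the colour image
    nir_dark_area: float          # fraction of skin flagged in the NIR image
    fluorescence_response: float  # normalised UV-induced fluorescence intensity


def classify_fruit(m: BandMeasurements) -> str:
    """Label a fruit by defect type, checking the most specific cue first."""
    # Fluorescence under UV is the strongest cue for green mould.
    if m.fluorescence_response > 0.5:
        return "green mould"
    # NIR can reveal anthracnose lesions that colour images may miss.
    if m.nir_dark_area > 0.05:
        return "anthracnose"
    # Remaining visible blemishes fall back to a generic skin-defect class.
    if m.visible_defect_area > 0.02:
        return "skin defect"
    return "sound"
```

A fruit is routed to whichever band fires first, so a strongly fluorescing fruit is labelled green mould even if its colour image also shows blemishes; ordering the checks by cue specificity is one plausible way to combine the bands.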