Abstract:
To address missed detections, false detections, and low detection efficiency in the appearance defect recognition of small cigarette packs under complex backgrounds, a recognition method based on an improved YOLOv8 model is proposed. First, the input images are preprocessed with a chain-of-thought-based adaptive image enhancement method to eliminate background noise. Second, an improved YOLOv8 model is constructed: a lightweight Coordinate Attention (CA) module embeds the positional information of features into the channel attention, and a Deformable Convolutional Network (DCN) allows the convolution kernel to adaptively adjust the shape of its receptive field, so that irregular contour features of small cigarette packs, such as seal wrinkles and incomplete graphics and text, are captured accurately. Finally, defect types are classified through a joint weight function. The results show that, compared with the original YOLOv8 model, the improved model increases accuracy, detection precision, and recall by 7.0, 4.9, and 7.0 percentage points, respectively, while significantly reducing model size. Compared with mainstream object detection models such as Faster R-CNN, the improved YOLOv8 model achieves markedly higher accuracy, detection precision, and recall, and performs well on the defect recognition task for small cigarette packs. In practical application, defect recognition was carried out on 20,000 small cigarette pack images collected in real time, achieving a recognition accuracy of 99.995%, a recall of 100%, a precision of 96.30%, and an average detection speed of 50 FPS; the proposed method can effectively identify different types of defects on small cigarette packs and meets the application requirements of actual working conditions. This study provides technical support for the appearance defect detection of small cigarette packs.
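The abstract does not give implementation details for the CA module, so the following is only a minimal sketch of a standard Coordinate Attention block (in the style of Hou et al., 2021) in PyTorch; the class name `CoordinateAttention` and the `reduction` parameter are illustrative assumptions, not the authors' code. The key idea, direction-aware pooling along height and width so that the resulting channel weights retain positional information, is what the abstract attributes to the CA module.

```python
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    """Minimal Coordinate Attention sketch (assumed structure).

    Pools along H and W separately, so the channel attention weights
    keep positional information along each spatial direction.
    """

    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Direction-aware pooling: (N, C, H, 1) over width, (N, C, W, 1) over height.
        x_h = x.mean(dim=3, keepdim=True)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)
        y = torch.cat([x_h, x_w], dim=2)          # (N, C, H+W, 1)
        y = self.act(self.bn(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                          # (N, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))      # (N, C, 1, W)
        return x * a_h * a_w                       # position-aware reweighting


# Usage: reweight a feature map from a YOLOv8 backbone stage.
feat = torch.randn(1, 256, 40, 40)
out = CoordinateAttention(256)(feat)               # same shape as feat
```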
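Similarly, the abstract names DCN but not its integration; a minimal sketch under the assumption of a standard deformable convolution (using `torchvision.ops.DeformConv2d`, which is an existing torchvision operator, while the wrapper class `DeformableConvBlock` is hypothetical) illustrates how a plain convolution predicts per-location sampling offsets so the receptive field can deform around irregular contours such as seal wrinkles.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class DeformableConvBlock(nn.Module):
    """Sketch of a deformable convolution block (assumed wiring).

    A plain conv predicts (dx, dy) offsets for each of the k*k kernel
    sampling points at every output location; DeformConv2d then samples
    the input at those shifted positions.
    """

    def __init__(self, c_in: int, c_out: int, k: int = 3):
        super().__init__()
        pad = k // 2
        # 2 offset channels (dx, dy) per kernel sampling point.
        self.offset = nn.Conv2d(c_in, 2 * k * k, kernel_size=k, padding=pad)
        self.dcn = DeformConv2d(c_in, c_out, kernel_size=k, padding=pad)
        # Zero-initialized offsets make the block start as an ordinary conv.
        nn.init.zeros_(self.offset.weight)
        nn.init.zeros_(self.offset.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.dcn(x, self.offset(x))


# Usage: drop-in replacement for a 3x3 conv inside a detection neck/backbone.
y = DeformableConvBlock(256, 256)(torch.randn(1, 256, 40, 40))
```

Zero-initializing the offset branch is a common design choice: the block behaves as a regular convolution at the start of training and learns to deform its sampling grid only where the data calls for it.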