Optimization Method of Basketball Match Evaluation Based on Computer Vision and Image Processing

Zhaohui Xie, Guangyu Wu

Abstract


Real-time target detection in computer vision has long faced challenges of accuracy and stability in basketball games. In light of this, the study first examines the shortcomings of computer vision systems in target detection and tracking. On this basis, it introduces the Faster Region-based Convolutional Neural Network (Faster R-CNN) algorithm to optimize target detection. To further strengthen target tracking, the study incorporates an improved pyramid optical flow algorithm and refines it with Kalman filtering. Finally, a novel target detection and tracking model integrating the optical flow algorithm and Faster R-CNN is proposed. Experimental results indicated that, compared with target detection and tracking models of the same type, the proposed model achieved superior target tracking and localization: the error between the tracked trajectory and the ground-truth trajectory was less than 3%. Moreover, the proposed model scored highest on all three evaluation indices: target position precision, the mean of precision and recall, and target tracking accuracy, peaking at 93.57%, 95.02%, and 91.57%, respectively. In conclusion, the novel target detection and tracking model proposed in the study markedly improves on the detection and recognition performance of existing target detection and tracking models, and offers substantial support for the development of optimization methods for evaluating basketball games.
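To make the tracking stage concrete, the minimal Python/OpenCV sketch below shows how the two components named in the abstract can be wired together: pyramidal Lucas-Kanade optical flow propagates a detected target position between frames, and a constant-velocity Kalman filter smooths the resulting trajectory. All parameter values (window size, pyramid depth, noise covariances) are illustrative assumptions, not the paper's settings, and the detector that supplies the initial position (e.g., Faster R-CNN) is outside the snippet.

```python
import cv2
import numpy as np

# Hypothetical sketch of the tracking stage: pyramidal Lucas-Kanade
# optical flow estimates the target's motion between frames, and a
# constant-velocity Kalman filter smooths the trajectory. Parameter
# values are illustrative assumptions, not the paper's settings.

def make_kalman(x0, y0):
    # State: (x, y, vx, vy); measurement: (x, y) from optical flow.
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3      # assumed
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1  # assumed
    kf.errorCovPost = np.eye(4, dtype=np.float32)
    kf.statePost = np.array([[x0], [y0], [0], [0]], np.float32)
    return kf

def track_step(prev_gray, curr_gray, point, kf):
    """Propagate one target point from prev_gray to curr_gray."""
    pts = np.array([[point]], dtype=np.float32)  # shape (1, 1, 2)
    # Pyramidal LK flow; 3 pyramid levels handle larger inter-frame motion.
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, pts, None, winSize=(21, 21), maxLevel=3)
    kf.predict()
    if status[0][0] == 1:
        # Flow succeeded: use its estimate as the Kalman measurement.
        kf.correct(new_pts[0, 0].reshape(2, 1))
    # Corrected (or, on flow failure, predicted) smoothed position.
    return kf.statePost[:2].ravel()
```

In a full pipeline, the Faster R-CNN detector would periodically re-initialize the tracked point to correct optical-flow drift, which mirrors the detection-tracking integration the abstract describes.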




DOI: https://doi.org/10.31449/inf.v48i23.6696

This work is licensed under a Creative Commons Attribution 3.0 License.