Ball Tracking System Using Kalman Filter and Deep Learning


Apisit Prempree
Jiraporn Kiatwuthiamorn
Chaipichit Cumpim

Abstract

This paper presents a method for detecting the positions of tennis balls in video by applying deep learning to the individual video frames. However, in some frames deep learning cannot detect the ball position accurately because the tennis ball is small and moves quickly. To address this, whenever the deep learning detector fails to track the ball, a Kalman filter is used to estimate the ball's position in those frames, and the missing positions are replaced with the estimated values. Experimental results demonstrate that the proposed method closely approximates the actual positions of the tennis balls in the image frames with high accuracy.
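To illustrate the idea, the sketch below shows a minimal constant-velocity Kalman filter in Python that predicts the ball position in every frame and corrects the prediction with the deep-learning detection when one is available; in frames where the detector misses the ball, the prediction alone replaces the missing position. The state model, noise covariances, and the kalman_step interface are illustrative assumptions, not the authors' implementation.

import numpy as np

# Minimal constant-velocity Kalman filter sketch for filling in missed
# ball detections. All matrix values below are assumed, not taken from the paper.

dt = 1.0                                   # time step between frames
F = np.array([[1, 0, dt, 0],               # state transition for state [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # only the (x, y) position is measured
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 1e-2                       # process noise covariance (assumed)
R = np.eye(2) * 5.0                        # measurement noise covariance (assumed)

x = np.zeros((4, 1))                       # state estimate
P = np.eye(4) * 100.0                      # state covariance

def kalman_step(z):
    """Predict the ball position; correct with detection z = (x, y), or None if the detector missed."""
    global x, P
    # Predict step
    x = F @ x
    P = F @ P @ F.T + Q
    if z is not None:                      # deep-learning detection available: update step
        z = np.array(z, dtype=float).reshape(2, 1)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
    return float(x[0, 0]), float(x[1, 0])  # estimated (x, y); used in place of a missing detection

In use, the detector's output for each frame (or None when it fails) would be passed to kalman_step, so the returned estimate fills the gaps left by missed detections.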

Article Details

How to Cite
Prempree, A., Kiatwuthiamorn, J., & Cumpim, C. (2024). Ball Tracking System Using Kalman Filter and Deep Learning. Journal of Advanced Development in Engineering and Science, 14(40), 108–133. Retrieved from https://ph03.tci-thaijo.org/index.php/pitjournal/article/view/592
Section
Research Article
