Fish Detection Using the Faster R-CNN Method (Deteksi Ikan Menggunakan Metode Faster R-CNN)

  • Yurni Oktarina, Politeknik Negeri Sriwijaya
  • Yolanda Eka Pratiwi, Politeknik Negeri Sriwijaya
  • Tresna Dewi, Politeknik Negeri Sriwijaya
DOI: https://doi.org/10.52158/jasens.v5i2.1132
Keywords: Fish Detection, Deep Learning, Faster R-CNN, Computer Vision, Model Evaluation.

Abstract

Automatic fish detection in video is a challenging task in the field of computer vision, which can be addressed using deep learning methods. This study proposes the use of the Faster Region-based Convolutional Neural Network (Faster R-CNN) to detect two types of fish, namely Manfish and Lemonfish, in video data. The dataset was constructed by extracting frames from video and processing them using the Roboflow platform. The model was trained and tested using pre-split training and testing sets. The training process was conducted over 40 epochs using the Adam optimization algorithm to improve detection accuracy. Model evaluation was carried out using several metrics, including Precision, Recall, mean Average Precision (mAP), Mean Squared Error (MSE), Root Mean Squared Error (RMSE), and R-squared (R²). The results show that the model achieved a precision of 94% and an accuracy of 87% for the Lemonfish class, and a precision of 95% and an accuracy of 89% for the Manfish class. These findings indicate that the model is capable of accurately detecting fish, delivering high detection performance, and effectively recognizing objects in video frames.
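The training setup summarized above can be pictured with a minimal PyTorch/torchvision sketch. Only the two fish classes, the 40-epoch schedule, and the Adam optimizer are stated in the abstract; the ResNet-50 FPN backbone, the learning rate, and the train_loader (assumed to yield Roboflow-exported video frames with bounding-box annotations) are illustrative assumptions, not the authors' exact implementation.

    import torch
    from torchvision.models.detection import fasterrcnn_resnet50_fpn
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

    # Three output classes: background, Manfish, Lemonfish.
    num_classes = 3

    # Start from a pretrained Faster R-CNN (ResNet-50 FPN backbone, an assumed
    # choice) and replace the box predictor head for the two fish classes.
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model.to(device)

    # Adam optimizer, as stated in the abstract; the learning rate is an assumption.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    def train(model, optimizer, train_loader, num_epochs=40):
        """Standard torchvision detection training loop over annotated frames."""
        model.train()
        for epoch in range(num_epochs):
            for images, targets in train_loader:  # targets: dicts with 'boxes', 'labels'
                images = [img.to(device) for img in images]
                targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
                loss_dict = model(images, targets)          # RPN + ROI head losses
                loss = sum(l for l in loss_dict.values())
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
            print(f"epoch {epoch + 1}: loss = {loss.item():.4f}")

After training, detection quality (Precision, Recall, mAP) can be computed from the predicted boxes on the test split, and regression-style metrics such as MSE, RMSE, and R² can be obtained with standard tooling (e.g., scikit-learn's mean_squared_error and r2_score); the abstract does not specify which implementation the authors used.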

Published
2024-12-31
How to Cite
Oktarina, Y., Pratiwi, Y. E., & Dewi, T. D. (2024). Deteksi Ikan Menggunakan Metode Faster R-CNN. Journal of Applied Smart Electrical Network and Systems, 5(2), 41-48. https://doi.org/10.52158/jasens.v5i2.1132