A Mobile Application for Detecting and Monitoring the Development Stages of Wild Flowers and Plants

João Videira, Pedro D. Gaspar, Vasco N. G. J. Soares, João M. L. P. Caldeira

Abstract


Wild flowers and plants appear spontaneously and form the ecological basis on which life depends. They play a fundamental role in the regeneration of natural life and the balance of ecological systems. However, this irreplaceable natural heritage is at risk of being lost due to human activity and climate change. The work presented in this paper contributes to the conservation effort. It builds on a previous study by the same authors, which identified computer vision as a suitable technological platform for detecting and monitoring the development stages of wild flowers and plants. It describes the process of developing a mobile application that uses YOLOv4 and YOLOv4-tiny convolutional neural networks to detect the development stages of wild flowers and plants. This application could be used by visitors to a nature park to provide information and raise awareness about the wild flowers and plants they find along roads and trails.
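The abstract does not spell out the detection pipeline, but YOLO-family detectors such as YOLOv4 and YOLOv4-tiny share a standard post-processing step: non-maximum suppression (NMS), which filters the many overlapping candidate boxes the network emits down to one box per detected plant. The sketch below is a minimal illustration of that step only; the box layout `[x1, y1, x2, y2]` and the IoU threshold of 0.45 are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

def iou(box, boxes):
    # Intersection-over-union between one box and an array of boxes,
    # with each box given as [x1, y1, x2, y2].
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = (box[2] - box[0]) * (box[3] - box[1])
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area + areas - inter)

def nms(boxes, scores, iou_thresh=0.45):
    # Greedy NMS: keep the highest-scoring box, discard boxes that
    # overlap it above the threshold, then repeat on the remainder.
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        order = rest[iou(boxes[i], boxes[rest]) < iou_thresh]
    return keep
```

For example, given two heavily overlapping boxes and one distant box, only the higher-scoring box of the overlapping pair survives along with the distant one. In the mobile application described here, a step like this would run on the converted model's raw output before results are shown to the user.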


References


Agence France-Presse, “Chain-reaction extinctions will cascade through nature: Study | Daily Sabah.” https://www.dailysabah.com/life/environment/chain-reaction-extinctions-will-cascade-through-nature-study (accessed Jan. 29, 2023).

L. E. Grivetti and B. M. Ogle, “Value of traditional foods in meeting macro- and micronutrient needs: the wild plant connection,” Nutr Res Rev, vol. 13, no. 1, pp. 31–46, Jun. 2000, doi: 10.1079/095442200108728990.

E. Christaki and P. Florou-Paneri, “Aloe vera: A plant for many uses,” J Food Agric Environ, vol. 8, pp. 245–249, 2010.

Trevor Dines, “Plantlife - A Voice for Wildflowers - Ark Wildlife UK.” https://www.arkwildlife.co.uk/blog/plantlife-a-voice-for-wildflowers/ (accessed Jan. 29, 2023).

X. Chi et al., “Threatened medicinal plants in China: Distributions and conservation priorities,” Biol Conserv, vol. 210, Part A, pp. 89–95, Jun. 2017, doi: 10.1016/J.BIOCON.2017.04.015.

Woodstream, “Learn The Six Plant Growth Stages.” https://www.saferbrand.com/articles/plant-growth-stages (accessed Sep. 04, 2023).

João Videira, Pedro D. Gaspar, Vasco N. G. J. Soares, and João M. L. P. Caldeira, “Detecting and Monitoring the Development Stages of Wild Flowers and Plants using Computer Vision: Approaches, Challenges and Opportunities (in press),” International Journal of Advances in Intelligent Informatics (IJAIN), 2023.

PEAT GmbH, “Plantix – your crop doctor – Apps on Google Play.” https://play.google.com/store/apps/details?id=com.peat.GartenBank&hl=pt_PT&gl=US (accessed Oct. 06, 2022).

AIBY Inc., “Plantum – Identify plants – Apps on Google Play.” https://play.google.com/store/apps/details?id=plant.identification.flower.tree.leaf.identifier.identify.cat.dog.breed.nature&hl=pt_PT&gl=US (accessed Oct. 06, 2022).

N. Buch, S. A. Velastin, and J. Orwell, “A review of computer vision techniques for the analysis of urban traffic,” IEEE Transactions on Intelligent Transportation Systems, vol. 12, no. 3, pp. 920–939, Sep. 2011, doi: 10.1109/TITS.2011.2119372.

S. Xu, J. Wang, W. Shou, T. Ngo, A. M. Sadick, and X. Wang, “Computer Vision Techniques in Construction: A Critical Review,” Archives of Computational Methods in Engineering 2020 28:5, vol. 28, no. 5, pp. 3383–3397, Oct. 2020, doi: 10.1007/S11831-020-09504-3.

Z. Song, Q. Chen, Z. Huang, Y. Hua, and S. Yan, “Contextualizing object detection and classification,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, no. 1, pp. 13–27, 2015, doi: 10.1109/CVPR.2011.5995330.

Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, vol. 521, no. 7553, pp. 436–444, May 2015, doi: 10.1038/nature14539.

The MathWorks Inc., “What Is Object Detection? - MATLAB & Simulink.” https://www.mathworks.com/discovery/object-detection.html?s_tid=srchtitle_object%20detection_1 (accessed Dec. 26, 2022).

A. Bochkovskiy, C.-Y. Wang, and H.-Y. M. Liao, “YOLOv4: Optimal Speed and Accuracy of Object Detection.” arXiv, 2020. doi: 10.48550/ARXIV.2004.10934.

G. Li, X. Huang, J. Ai, Z. Yi, and W. Xie, “Lemon-YOLO: An efficient object detection method for lemons in the natural environment,” IET Image Process, vol. 15, no. 9, pp. 1998–2009, Mar. 2021, doi: 10.1049/ipr2.12171.

A. Shill and M. A. Rahman, “Plant disease detection based on YOLOv3 and YOLOv4,” 2021 International Conference on Automation, Control and Mechatronics for Industry 4.0, ACMI 2021, pp. 1–6, Jul. 2021, doi: 10.1109/ACMI53878.2021.9528179.

J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You Only Look Once: Unified, Real-Time Object Detection,” in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, Jun. 2016. doi: 10.1109/cvpr.2016.91.

J. Redmon and A. Farhadi, “YOLOv3: An Incremental Improvement,” Apr. 2018, Accessed: Aug. 21, 2023. [Online]. Available: https://arxiv.org/abs/1804.02767v1

The MathWorks Inc., “Anchor Boxes for Object Detection - MATLAB & Simulink.” https://www.mathworks.com/help/vision/ug/anchor-boxes-for-object-detection.html (accessed Dec. 26, 2022).

Q. Chen and Q. Xiong, “Garbage Classification Detection Based on Improved YOLOV4,” Journal of Computer and Communications, vol. 8, pp. 285–294, 2020, doi: 10.4236/jcc.2020.812023.

Z. Jiang, L. Zhao, S. Li, Y. Jia, and Z. Liquan, “Real-time object detection method based on improved YOLOv4-tiny,” Journal of Network Intelligence, vol. 7, no. 1, Nov. 2022, Accessed: Aug. 23, 2023. [Online]. Available: https://arxiv.org/abs/2011.04244v2

L. Song et al., “Object detection based on Yolov4-Tiny and Improved Bidirectional feature pyramid network,” Journal of Physics: Conference Series (2021 International Conference on Electronic Communication, Computer Science and Technology, Nanchang, China), vol. 2209, no. 1, Feb. 2022, doi: 10.1088/1742-6596/2209/1/012023.

W. Zhang et al., “Airborne infrared aircraft target detection algorithm based on YOLOv4-tiny,” Journal of Physics: Conference Series (2021 International Conference on Advances in Optics and Computational Sciences (ICAOCS), Jan. 21–23, 2021, Ottawa, Canada), vol. 1865, no. 4, Apr. 2021, doi: 10.1088/1742-6596/1865/4/042007.

Ken-ichi Ueda, Nate Agrin, and Jessica Kline, “A community for naturalists · iNaturalist.” https://www.inaturalist.org/ (accessed Aug. 21, 2023).

Ken-ichi Ueda, Nate Agrin, and Jessica Kline, “A community for naturalists · BioDiversity4All.” https://www.biodiversity4all.org/ (accessed Aug. 23, 2023).

João Videira, Pedro D. Gaspar, Vasco N. G. J. Soares, and João M. L. P. Caldeira, “MontanhaVivaApp Dataset | Kaggle.” https://www.kaggle.com/datasets/krosskrosis/montanhavivaapp-dataset (accessed Sep. 05, 2023).

Yonghye Kwon, “GitHub - developer0hye/Yolo_Label: GUI for marking bounded boxes of objects in images for training neural network YOLO.” https://github.com/developer0hye/Yolo_Label (accessed Aug. 21, 2023).

Alexey Bochkovskiy, “GitHub - AlexeyAB/darknet: YOLOv4 / Scaled-YOLOv4 / YOLO - Neural Networks for Object Detection (Windows and Linux version of Darknet).” https://github.com/AlexeyAB/darknet (accessed Aug. 21, 2023).

J. A. Cook and J. Ranstam, “Overfitting,” British Journal of Surgery, vol. 103, no. 13, p. 1814, Dec. 2016, doi: 10.1002/bjs.10244.

X. Ying, “An Overview of Overfitting and its Solutions,” J Phys Conf Ser, vol. 1168, no. 2, p. 022022, Feb. 2019, doi: 10.1088/1742-6596/1168/2/022022.

L. Prechelt, “Early Stopping - But When?,” in Neural Networks: Tricks of the Trade, Lecture Notes in Computer Science, Springer, 1998, pp. 55–69, doi: 10.1007/3-540-49430-8_3.

The Interaction Design Foundation, “What is User Centered Design?” https://www.interaction-design.org/literature/topics/user-centered-design (accessed Sep. 08, 2023).

Eastern Peak, “Iterative Development.” https://easternpeak.com/definition/iterative-development/ (accessed Sep. 08, 2023).

Matt Rae, “What is Adobe XD and What is it Used for?” https://www.adobe.com/products/xd/learn/get-started/what-is-adobe-xd-used-for.html (accessed Sep. 08, 2023).

Pavel Gorbachenko, “Functional vs Non-Functional Requirements | Enkonix.” https://enkonix.com/blog/functional-requirements-vs-non-functional/ (accessed Sep. 08, 2023).

IBM, “Use-case diagrams - IBM Documentation.” https://www.ibm.com/docs/en/rational-soft-arch/9.6.1?topic=diagrams-use-case (accessed Sep. 08, 2023).

Google and JetBrains, “Android Studio & App Tools - Android Developers.” https://developer.android.com/studio (accessed Sep. 08, 2023).

D. Richard Hipp, “SQLite.” https://www.sqlite.org/index.html (accessed Sep. 08, 2023).

Armin Ronacher, “Welcome to Flask.” https://flask.palletsprojects.com/en/2.3.x/ (accessed Sep. 08, 2023).

João Videira, Pedro D. Gaspar, Vasco N. G. J. Soares, and João M. L. P. Caldeira, “MontanhaVivaApp – Google Drive.” https://drive.google.com/drive/u/2/folders/1FX6pwvDgV2lN9u3EwtH66DhPunIPJ3AT (accessed Aug. 24, 2023).

Việt Hùng, “GitHub - hunglc007/tensorflow-yolov4-tflite: YOLOv4, YOLOv4-tiny, YOLOv3, YOLOv3-tiny Implemented in Tensorflow 2.0, Android. Convert YOLO v4 .weights tensorflow, tensorrt and tflite.” https://github.com/hunglc007/tensorflow-yolov4-tflite (accessed Aug. 21, 2023).

João Videira, Pedro D. Gaspar, Vasco N. G. J. Soares, and João M. L. P. Caldeira, “videira202011/MontanhaVivaApp.” https://github.com/videira202011/MontanhaVivaApp (accessed Sep. 04, 2023).

Oracle, “Creating and Using Packages (The JavaTM Tutorials > Learning the Java Language > Packages).” https://docs.oracle.com/javase/tutorial/java/package/packages.html (accessed Sep. 05, 2023).

DOI: https://doi.org/10.31449/inf.v48i6.5645

Creative Commons License
This work is licensed under a Creative Commons Attribution 3.0 License.