Please use this identifier to cite or link to this item: https://hdl.handle.net/1959.11/30941
Title: Fruit load estimation in mango orchards - a method comparison
Contributor(s): Underwood, J P (author); Rahman, M M (author); Robson, A (author); Walsh, K B (author); Koirala, A (author); Wang, Z (author)
Publication Date: 2018
Open Access: Yes
Handle Link: https://hdl.handle.net/1959.11/30941
Open Access Link: https://research.qut.edu.au/future-farming/projects/icra-2018-workshop-on-robotic-vision-and-action-in-agriculture/
Abstract: The fruit load of entire mango orchards was estimated well before harvest using (i) in-field machine vision on mobile platforms and (ii) WorldView-3 satellite imagery. For in-field machine vision, two imaging platforms were used: (a) a daytime imaging system with LiDAR-based tree segmentation and multiple views per tree, and (b) a nighttime imaging system using two images per tree. The machine vision approaches involved training neural networks on image snips from one orchard only, followed by application to all other orchards (varying in location and cultivar). Estimates of fruit load per tree achieved up to R2 = 0.88 and RMSE = 22.5 fruit/tree against harvest fruit count per tree (n = 18 trees per orchard). For satellite imaging, a regression was established between a number of spectral indices and fruit number for a set (n = 18) of trees in each orchard (example: R2 = 0.57, RMSE = 22 fruit/tree), and this model was applied across all tree-associated pixels per orchard. The weighted average percentage error on packhouse counts (weighted by packhouse fruit numbers), averaged across all orchards assessed, was 6.0, 8.8 and 9.9% for the daytime imaging system, the nighttime imaging system and the satellite method, respectively. Additionally, fruit sizing was achieved with RMSE = 5 mm (on fruit length and width). These estimates are useful for harvest resource planning and marketing, and set the foundation for automated harvest.
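The abstract reports two kinds of accuracy figures: per-tree agreement with harvest counts (R2, RMSE) and an orchard-level percentage error on packhouse counts, weighted by packhouse fruit numbers. The sketch below is not the authors' code; it is a minimal Python/NumPy illustration of how such metrics are typically computed, and all input numbers are made-up placeholders rather than data from the study.

import numpy as np

def r2_rmse(predicted, observed):
    """R2 and RMSE of predicted per-tree fruit counts against harvest counts."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    residuals = observed - predicted
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean(residuals ** 2))
    return r2, rmse

def weighted_avg_pct_error(estimated_totals, packhouse_totals):
    """Per-orchard percentage error, averaged with packhouse counts as weights."""
    est = np.asarray(estimated_totals, dtype=float)
    pack = np.asarray(packhouse_totals, dtype=float)
    pct_err = np.abs(est - pack) / pack * 100.0
    return np.average(pct_err, weights=pack)

# Illustrative usage only (hypothetical numbers, not study data).
machine_vision_counts = [110, 95, 130, 80]   # predicted fruit/tree
harvest_counts = [120, 90, 140, 75]          # fruit counted at harvest
print(r2_rmse(machine_vision_counts, harvest_counts))

orchard_estimates = [25300, 41800, 18900]    # estimated fruit per orchard
packhouse_counts = [24000, 45000, 20000]     # packhouse fruit numbers (weights)
print(weighted_avg_pct_error(orchard_estimates, packhouse_counts))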
Publication Type: Conference Publication
Conference Details: ICRA 2018 Workshop on Robotic Vision and Action in Agriculture, Brisbane, Australia, 21st - 25th May, 2018
Source of Publication: Robotic Vision and Action in Agriculture: the future of agri-food systems and its deployment to the real-world, p. 1-6
Publisher: Queensland University of Technology
Place of Publication: Brisbane, Australia
Fields of Research (FoR) 2020: 300206 Agricultural spatial analysis and modelling
300802 Horticultural crop growth and development
300207 Agricultural systems analysis and modelling
Socio-Economic Objective (SEO) 2020: 260515 Tree nuts (excl. almonds and macadamias)
Peer Reviewed: Yes
HERDC Category Description: E2 Non-Refereed Scholarly Conference Publication
Publisher/associated links: https://research.qut.edu.au/future-farming/projects/icra-2018-workshop-on-robotic-vision-and-action-in-agriculture/
Appears in Collections: Conference Publication
School of Science and Technology

Files in This Item: 2 files

Page view(s): 1,456 (checked on Mar 7, 2023)
Download(s): 6 (checked on Mar 7, 2023)

Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.