Please use this identifier to cite or link to this item: https://hdl.handle.net/1959.11/21357
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Sadgrove, Edmund | en |
dc.contributor.author | Falzon, Gregory | en |
dc.contributor.author | Miron, David J | en |
dc.contributor.author | Lamb, David | en |
dc.date.accessioned | 2017-06-14T15:19:00Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | Computers and Electronics in Agriculture, v.139, p. 204-212 | en |
dc.identifier.issn | 1872-7107 | en |
dc.identifier.issn | 0168-1699 | en |
dc.identifier.uri | https://hdl.handle.net/1959.11/21357 | - |
dc.description.abstract | Object detection is an essential function of robotics-based agricultural systems and many algorithms exist for this purpose. Colour, although an important characteristic, is often avoided in favour of faster grey-scale implementations or is only used in a rudimentary arrangement. This study presents the Colour Feature Extreme Learning Machine (CF-ELM), an implementation of the extreme learning machine (ELM) with a partially connected hidden layer and a fully connected output layer, taking three colour inputs instead of the standard grey-scale input. The CF-ELM was tested with three different colour systems, HSV, RGB and Y'UV, and compared for time and accuracy against the standard grey-scale ELM. The four implementations were tested on three different datasets covering weed detection, vehicle detection and stock detection. It was found that the colour implementations performed better overall on all three datasets and that Y'UV was the best-performing colour system on all tested datasets, delivering the highest accuracy of 84% in weed detection, 96% in vehicle detection and 86% in stock detection. Alongside the CF-ELM, an algorithm is introduced for desktop-based classification of objects within a pastoral landscape, with individual processing times between 0.06s and 0.18s for a single image, tested in each colour space. The algorithm is designed for use in scenarios with difficult and unpredictable terrain, making it ideal for agricultural or pastoral landscapes. | en |
dc.language | en | en |
dc.publisher | Elsevier BV | en |
dc.relation.ispartof | Computers and Electronics in Agriculture | en |
dc.title | Fast object detection in pastoral landscapes using a Colour Feature Extreme Learning Machine | en |
dc.type | Journal Article | en |
dc.identifier.doi | 10.1016/j.compag.2017.05.017 | en |
dc.subject.keywords | Agricultural Spatial Analysis and Modelling | en |
dc.subject.keywords | Computer Graphics | en |
dc.subject.keywords | Computer Vision | en |
local.contributor.firstname | Edmund | en |
local.contributor.firstname | Gregory | en |
local.contributor.firstname | David J | en |
local.contributor.firstname | David | en |
local.subject.for2008 | 070104 Agricultural Spatial Analysis and Modelling | en |
local.subject.for2008 | 080103 Computer Graphics | en |
local.subject.for2008 | 080104 Computer Vision | en |
local.subject.seo2008 | 960904 Farmland, Arable Cropland and Permanent Cropland Land Management | en |
local.profile.school | School of Science and Technology | en |
local.profile.school | School of Science and Technology | en |
local.profile.school | Research Services | en |
local.profile.school | School of Science and Technology | en |
local.profile.email | esadgro2@une.edu.au | en |
local.profile.email | gfalzon2@une.edu.au | en |
local.profile.email | dmiron@une.edu.au | en |
local.profile.email | dlamb@une.edu.au | en |
local.output.category | C1 | en |
local.record.place | au | en |
local.record.institution | University of New England | en |
local.identifier.epublicationsrecord | une-20170613-162813 | en |
local.publisher.place | Netherlands | en |
local.format.startpage | 204 | en |
local.format.endpage | 212 | en |
local.identifier.scopusid | 85019773323 | en |
local.peerreviewed | Yes | en |
local.identifier.volume | 139 | en |
local.contributor.lastname | Sadgrove | en |
local.contributor.lastname | Falzon | en |
local.contributor.lastname | Miron | en |
local.contributor.lastname | Lamb | en |
dc.identifier.staff | une-id:esadgro2 | en |
dc.identifier.staff | une-id:gfalzon2 | en |
dc.identifier.staff | une-id:dmiron | en |
dc.identifier.staff | une-id:dlamb | en |
local.profile.orcid | 0000-0002-8710-9900 | en |
local.profile.orcid | 0000-0002-1989-9357 | en |
local.profile.orcid | 0000-0003-2157-5439 | en |
local.profile.orcid | 0000-0002-2917-2231 | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.identifier.unepublicationid | une:21550 | en |
local.identifier.handle | https://hdl.handle.net/1959.11/21357 | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
local.title.maintitle | Fast object detection in pastoral landscapes using a Colour Feature Extreme Learning Machine | en |
local.output.categorydescription | C1 Refereed Article in a Scholarly Journal | en |
local.search.author | Sadgrove, Edmund | en |
local.search.author | Falzon, Gregory | en |
local.search.author | Miron, David J | en |
local.search.author | Lamb, David | en |
local.uneassociation | Unknown | en |
local.identifier.wosid | 000404320100019 | en |
local.year.published | 2017 | en |
local.fileurl.closedpublished | https://rune.une.edu.au/web/retrieve/32d56a41-92d9-473e-b80e-057ee8d3588e | en |
local.subject.for2020 | 300206 Agricultural spatial analysis and modelling | en |
local.subject.for2020 | 460702 Computer graphics | en |
local.subject.for2020 | 460304 Computer vision | en |
local.subject.seo2020 | 180603 Evaluation, allocation, and impacts of land use | en |
local.subject.seo2020 | 180607 Terrestrial erosion | en |
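The abstract above describes the CF-ELM architecture only in outline: three colour-channel inputs, a partially connected hidden layer, and a fully connected output layer solved in closed form as in a standard ELM. The sketch below is one minimal NumPy reading of that outline, not the authors' published implementation; the per-channel hidden banks, the sigmoid activation, the concatenation step and all names (`train_cf_elm`, `predict_cf_elm`, `hidden_per_channel`) are assumptions made for illustration.

```python
import numpy as np

def train_cf_elm(channels, targets, hidden_per_channel=100, seed=0):
    """Sketch of a colour-feature ELM (assumed reading of the abstract):
    each colour channel (e.g. Y', U, V) drives its own randomly weighted
    bank of hidden neurons (a partially connected hidden layer); the
    concatenated hidden activations feed a fully connected output layer
    whose weights are solved by least squares, as in a standard ELM."""
    rng = np.random.default_rng(seed)
    hidden_blocks, random_weights = [], []
    for X in channels:  # X: (n_samples, n_features) for one colour channel
        W = rng.standard_normal((X.shape[1], hidden_per_channel))
        b = rng.standard_normal(hidden_per_channel)
        hidden_blocks.append(1.0 / (1.0 + np.exp(-(X @ W + b))))  # sigmoid hidden activations
        random_weights.append((W, b))
    H = np.hstack(hidden_blocks)            # concatenate the per-channel hidden outputs
    beta = np.linalg.pinv(H) @ targets      # output weights via Moore-Penrose pseudoinverse
    return random_weights, beta

def predict_cf_elm(channels, random_weights, beta):
    """Apply the same random per-channel projections, then the learned output layer."""
    H = np.hstack([1.0 / (1.0 + np.exp(-(X @ W + b)))
                   for X, (W, b) in zip(channels, random_weights)])
    return H @ beta
```

Under this reading, each entry of `channels` would hold one colour plane of the image patches flattened to feature vectors, and the predicted class is taken as the argmax over the output vector returned by `predict_cf_elm`.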
Appears in Collections: Journal Article, School of Science and Technology