Adding depth to weed sensing science
03 Oct 2012
The next generation of precision weed sensing technology is a step closer to reality, offering the grains and cotton industries the prospect of targeting a wider range of problem species and further reducing herbicide use.
Results from commercial spot sprayers show herbicide reductions of 50–90% when used in fallow situations, but take-up of weed sensing machinery by cropping industries has been limited by its inability to discriminate between different plant species.
However, an improved imaging system developed by PhD student Steven Rees and Dr Cheryl McCarthy of the National Centre for Engineering in Agriculture (NCEA), based at the University of Southern Queensland, is a major step towards automated, species-specific weed spraying.
“Our research demonstrates that discrimination of weed species in real-world on-farm conditions is achievable using combined colour and depth image analysis,” Dr McCarthy said.
“The proof-of-concept technology demonstrates discrimination of weed species by using cameras to detect broadleaf and grasses, and even has the potential for individual broadleaf or grass species to be identified automatically.
“It is an important breakthrough because alternative weed control strategies are required as the cotton and grains industries face growing herbicide resistance in minimum and no-till farming systems.
“This technology will contribute to integrated weed management practices and might be used ultimately to scout, map and selectively spray on-farm weed infestations and inform management strategies and tank mixes.”
The research was funded by the Rural Industries Research and Development Corporation (RIRDC), which managed the Australian Government’s National Weeds Research and Productivity Program.
Weeds cost Australian agriculture more than $4 billion each year in control costs and lost production.
The RIRDC Weeds Program, which concluded on June 30 this year, invested more than $12m in over 50 projects that would improve the knowledge and understanding of weeds, and provide land managers with tools to control weeds and reduce their impact on agriculture and biodiversity.
Dr McCarthy said more research was needed to further advance the technology so that it could be integrated with a weed classifier system linked to the spray trigger.
If the technology were realised, she said, the reduction in herbicide use, coupled with precise knowledge of the weed species present, would make a much larger range of herbicides viable, reducing the risk of herbicide resistance developing.
Existing weed sensor imaging technology struggles to segment individual leaves from a scene of weeds – a difficult task when more than one weed species is growing together, often at different heights. Commercial systems therefore target any green vegetation on a soil or stubble background.
Researchers in this field have sought to improve machine vision-based weed discrimination by analysing colour, shape and texture features.
A 2008 review of weed control systems found that although accuracies of 65–95% can be achieved, this occurs only under ideal conditions. The systems were found unsuitable for real-world conditions, where leaf shape can be distorted by numerous factors and crop and weed leaves often occlude one another.
Against this background, the NCEA project set out to create a prototype machine that could identify problem weeds in a real-world setting – this meant dealing with issues including inconsistent lighting, interference from ground cover (i.e. stubble) and leaf occlusion.
Dr McCarthy said the challenge was to develop a precision sensing system with the “capability to extract whole leaves for classification from a scene containing many weeds”.
The team tested two camera systems - a combined colour and depth camera and a high resolution colour camera - for their ability to capture effective images of weeds for analysis in real-time.
A three-metre unit was developed to house and shade the two camera systems while being towed during paddock trials on Queensland’s Darling Downs.
The unit was used to collect weed images under the machine vision system’s expected operational conditions, targeting fleabane, sowthistle, liverseed grass, feathertop Rhodes grass, wild sorghum and wild oats.
The results encouraged the researchers to develop a new image analysis technique that can discriminate between grass and broadleaf species, and between different broadleaf species. Both active and passive methods of generating depth data were investigated so that height-based weed segmentation could be used as a pre-processing step before the more “computationally-intense” colour-based image analysis.
“Automated analysis of colour images enabled extraction of individual grass leaves (liverseed, wild oats, feathertop Rhodes grass and wild sorghum) and discrimination of grasses from broadleaf weeds (sowthistle and fleabane),” Dr McCarthy said.
“But greater resolution was required to extract the features of broadleaf species than of grass species. An active depth sensor was found to reduce image complexity by at least 80% for images containing weeds at a distinct height – for example, standing grass amongst low-lying broadleaves and grasses.”
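The idea of using depth data to cut down the image area before the slower colour-based analysis can be sketched in a few lines. This is a simplified, hypothetical illustration (the function, the toy depth values and the 0.10 m threshold are invented for this sketch, not the NCEA implementation), assuming a per-pixel height map registered to the colour image:

```python
import numpy as np

def depth_presegment(depth, green_mask, height_threshold):
    """Keep only vegetation pixels standing above a height threshold.

    depth            2-D array of per-pixel heights above the soil plane (metres)
    green_mask       2-D boolean array marking green (vegetation) pixels
    height_threshold height separating standing grass from low-lying broadleaves
    """
    tall = depth > height_threshold        # pixels at a distinct height
    candidates = tall & green_mask         # tall AND green: likely standing grass
    # Fraction of vegetation pixels excluded from colour-based analysis
    reduction = 1.0 - candidates.sum() / max(green_mask.sum(), 1)
    return candidates, reduction

# Toy 4x4 scene: one corner holds standing grass (0.30 m), the rest is
# low-lying vegetation (0.05 m) or bare soil (0.00 m).
depth = np.array([[0.30, 0.30, 0.05, 0.05],
                  [0.30, 0.30, 0.05, 0.05],
                  [0.05, 0.05, 0.05, 0.00],
                  [0.05, 0.05, 0.00, 0.00]])
green = depth > 0.0                        # everything but bare soil is green
mask, reduction = depth_presegment(depth, green, height_threshold=0.10)
print(mask.sum(), round(reduction, 2))     # 4 tall pixels; ~69% of vegetation excluded
```

Only the masked regions would then be passed to the expensive colour, shape and texture analysis, which is how a height pre-process can reduce image complexity in scenes with weeds at distinct heights.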
The subsequent results demonstrated that discrimination of weed species in real-world on-farm conditions is achievable by using combined colour and depth image analysis.
The NCEA is now further testing its research through grants from the Sugar Research & Development Corporation (SRDC), Horticulture Australia Limited (HAL) and Botanical Resources Australia (BRA), which it hopes will develop the technology from the proof-of-concept stage towards commercialisation.
Media contact: Michael Thomson, Cox Inall Communications, 07 4927 0805 / 0408 819 666.