International Journal of Agricultural and Biological Engineering
Authors: Su Baofeng, Xue Jinru, Xie Chunyu, Fang Yulin, Song Yuyang, Sigfredo Fuentes
College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling 712100, China
College of Enology, Northwest A&F University, Yangling 712100, China
Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville 3010, Australia
Accurate data acquisition and analysis to obtain crop canopy information are critical steps to understand plant growth dynamics and to assess the potential impacts of biotic or abiotic stresses on plant development. A versatile and easy-to-use monitoring system will allow researchers and growers to improve follow-up management strategies within farms once potential problems have been detected. This study reviewed existing remote sensing platforms and relevant information applied to crops, and specifically grapevines, to equip a simple Unmanned Aerial Vehicle (UAV) with a high-definition visible (RGB) camera. The objective of the proposed Unmanned Aerial System (UAS) was to implement a Digital Surface Model (DSM) in order to obtain accurate information about affected or missing grapevines that can be attributed to potential biotic or abiotic stress effects. The analysis process started with a three-dimensional (3D) reconstruction from the RGB images collected from grapevines using the UAS and the Structure from Motion (SfM) technique to obtain the DSM on a per-plant basis. The DSM was then expressed as greyscale images using the halftone technique, and the information on affected and missing grapevines was finally extracted using computer vision algorithms based on canopy cover measurement and classification. To validate the proposed automated method, each grapevine row within the study area was visually inspected, and the inspection was compared to the digital assessment from the proposed UAS to validate the calculations of affected and missing grapevines for the whole studied vineyard. Results showed that 9.5% of grapevines were affected and 7.3% were missing within the studied area. Therefore, for this specific study, the abiotic stress that affected the experimental vineyard (frost) impacted a total of 16.8% of plants.
This study provided a new method for automatically surveying affected or missing grapevines in the field and an evaluation tool for plant growth conditions, which can also be implemented for other uses such as canopy management, irrigation scheduling and other precision agriculture applications.
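The per-plant classification step described in the abstract (thresholding a DSM and measuring canopy cover within each vine's footprint) could be sketched roughly as follows. The grid layout, height threshold, and cover cut-offs here are illustrative assumptions, not the authors' actual parameters.

```python
# Hypothetical sketch of per-plant classification from a DSM: pixels above a
# height threshold count as canopy, fractional cover is measured per vine
# cell, and each vine is labelled healthy, affected, or missing.
import numpy as np

def classify_vines(dsm, plant_size, height_thresh=0.5,
                   affected_cover=0.4, missing_cover=0.05):
    """Label each plant cell of a DSM by fractional canopy cover.

    dsm          : 2D array of canopy heights above ground (m)
    plant_size   : (rows, cols) of the DSM window covering one vine
    height_thresh: pixels above this height count as canopy
    """
    canopy = dsm > height_thresh  # binary canopy mask
    pr, pc = plant_size
    labels = []
    for r in range(0, dsm.shape[0] - pr + 1, pr):
        for c in range(0, dsm.shape[1] - pc + 1, pc):
            cover = canopy[r:r + pr, c:c + pc].mean()
            if cover < missing_cover:
                labels.append("missing")
            elif cover < affected_cover:
                labels.append("affected")
            else:
                labels.append("healthy")
    return labels

# Toy example: three vine cells of 4x4 pixels each.
dsm = np.zeros((4, 12))
dsm[:, 0:4] = 1.2    # full canopy   -> healthy
dsm[0:1, 4:8] = 1.2  # sparse canopy -> affected
                     # last cell stays bare -> missing
print(classify_vines(dsm, (4, 4)))  # ['healthy', 'affected', 'missing']
```

Vineyard-scale percentages of affected and missing plants then follow from counting labels over all rows.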
How much do Australian wine growers spend on new technologies like UAVs, sensors and robots?
Information on this aspect is very scattered, and there are no formal studies on technology spending by winegrowers in Australia. Wine Australia may have more information, especially on funding spent on research and applications for the Australian wine industry, since Australian grape growers pay a levy every year that Wine Australia administers for research benefiting the industry. These funds are awarded on a competitive basis to research institutions. In 2015 there was a specific call, Digital Viticulture, which addressed all of these topics. In a sense, this shows that we are in the early stages of formal, commercial application of this technology, beyond what can be found in advertisements from private companies offering drone and robot services for irrigation, fertilization, pest and disease control, and other management tasks.
Dr Sigfredo Fuentes is a plant physiologist and engineer at the University of Melbourne whose end game is the smart farm.
In his mind, it won’t be restricted to a single crop – maybe it’ll grow peaches and nectarines and vines as well – all of which are mapped out digitally.
And at the heart of the operation will be the drone, autonomous and equipped to collect all the data it needs to monitor the entire farm down to the last plant or tree.
And it’s fast – it can cover 2500 hectares in a day, which is about 2000 football fields.
Sensors for Agriculture/Forestry Research and Applications
Call for Papers
Challenges imposed by climate change have driven significant interest and investment by different countries in research areas around smart digital agriculture and forestry. This has been especially triggered by a projected population increase to 9.2 billion by 2050. This pressure alone will require doubling global food production on half of the remaining arable land due to resource depletion, which will directly impact agricultural practices and forestry resources and management.
In order to overcome the effects of climate change and to remain competitive and sustainable as a country in the agricultural and forestry sectors, there is a need to acknowledge these challenges and to support research into, and application of, new and emerging sensor technologies. The development of new and emerging technologies applied to sensor networks will help by basing decision-making on more accurate and meaningful data with high spatial and temporal resolution.
Sensor technology and sensor networks using telemetry systems and the Internet of Things (IoT) are becoming important for research areas that can be applied to digital agriculture and forestry. The key challenge in producing accurate agricultural and forestry models relies critically on the timely provision of high-quality, geospatially distributed data. This requires the development of complex workflows of real-time sensor calibration, data transfer, and image processing and interpretation, integrated into optimal, high-performing computational nodes and networks. An example is imaging sensor data, where image sensors need to be radiometrically and geometrically calibrated so that each pixel value can be reliably converted into an at-surface reflectance value. Conventional sensing systems rely on time-consuming postprocessing that depends on specialized skills and specific software, which significantly delays the delivery of the final information to users. This call for papers focuses on systems that provide an automated, integrated set of tools that can standardize the key components of aerial and ground sensor data processing, empowering industry and academics to focus on innovation. Such systems will enable near-real-time distribution of monitored soil, plant, and atmospheric factors, allowing data mapping and delivery via mobile devices.
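One common way to perform the pixel-value-to-reflectance conversion mentioned above is the empirical line method: raw digital numbers (DN) from two reference panels of known reflectance fix a linear gain and offset, which then convert every pixel to at-surface reflectance. The panel DN and reflectance values below are illustrative assumptions, not data from any specific sensor.

```python
# Minimal sketch of empirical-line radiometric calibration: fit a linear
# DN -> reflectance mapping from two reference panels, then apply it to a band.
import numpy as np

def empirical_line(dn_dark, refl_dark, dn_bright, refl_bright):
    """Return gain and offset so that reflectance = gain * DN + offset."""
    gain = (refl_bright - refl_dark) / (dn_bright - dn_dark)
    offset = refl_dark - gain * dn_dark
    return gain, offset

# Dark panel: DN 20 -> 5% reflectance; bright panel: DN 220 -> 85%.
gain, offset = empirical_line(20, 0.05, 220, 0.85)
band = np.array([[20, 120, 220]])
reflectance = gain * band + offset
print(np.round(reflectance, 2))  # [[0.05 0.45 0.85]]
```

In practice the calibration is repeated per spectral band, since gain and offset vary with wavelength and illumination.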
The proposed technology can also include a cloud computing framework for sensor calibration, processing, fusion, and classification to reduce the complexity and time required to develop workflows.
Papers based on the following aspects will be highly considered: (i) frameworks to process and fuse ground-based sensor networks and meteorological information with remotely sensed data from satellites and UAVs, to rapidly produce high-quality geospatial products that help visualize our environment in extreme detail; (ii) research using cloud computing on High-Performance Computing (HPC) platforms to enable rapid and automated processing of aerial imagery and ground-based sensor network data, streamlining the process from data acquisition to data analysis; and (iii) work showing how knowledge and experience shared through collaboration between industry and academics can centralize development efforts for sensor data processing and visualization algorithms, leading to higher-quality geospatial products.
Potential topics include but are not limited to the following:
- New sensor development and application for agriculture and forestry trials
- Sensor network development, data transmission, self-healing, and redundancy considerations
- Remote sensing using satellite, airborne, and unmanned aerial vehicles (UAV) integrated with sensor network technology
- Visualization systems and software platforms developed to integrate sensor networks for decision making processes
- Smart sensors of low costs applicable to agriculture and forestry
- Development of integrated models with sensor networks and applications to smart irrigation for agriculture and forestry environments
Authors can submit their manuscripts through the Manuscript Tracking System at http://mts.hindawi.com/submit/journals/js/safra/.
Manuscript Due: Friday, 10 February 2017
First Round of Reviews: Friday, 5 May 2017
Publication Date: Friday, 30 June 2017
Lead Guest Editor
- Sigfredo Fuentes, University of Melbourne, Melbourne, Australia
For more information, see the Journal of Sensors.