Footage of a drone following a drone carrying a hyperspectral camera over a tomato crop. The project aims to develop machine learning algorithms that recognise early indications of plant disease. The Melbourne Unmanned Aerial Vehicle System Platform (MUASIP) is a project with the Infrastructure Engineering Department, The … Continue reading Drone following a drone: Early disease detection project for Tomatoes
How much do Australian wine growers spend on new technologies like UAVs, sensors and robots?
Information on this topic is very scattered, and there are no formal studies of spending by winegrowers in Australia. Wine Australia may have more information, especially on funding spent on research and applications for the Australian wine industry, since Australian grape growers pay an annual levy that Wine Australia administers for research benefiting the industry. These funds are awarded on a competitive basis to research institutions. In 2015 there was a specific call, Digital Viticulture, which addressed all of these topics. This suggests we are still in the early stages of formally applying this technology for commercial purposes, beyond the advertisements from private companies offering drone and robot services for management tasks such as irrigation, fertilisation, and pest and disease control.
Link to the full interview: CLICK HERE
Dr Sigfredo Fuentes is a plant physiologist and engineer at the University of Melbourne whose end game is the smart farm.
In his mind, it won’t be restricted to a single crop – maybe it’ll grow peaches and nectarines and vines as well – all of which are mapped out digitally.
And at the heart of the operation will be the drone, autonomous and equipped to collect all the data it needs to monitor the entire farm down to the last plant or tree.
And it’s fast – it can cover 2500 hectares in a day, which is about 2000 football fields.
Link to full article: CLICK HERE
There are currently no standardized objective measures to assess beer quality based on the most significant parameters related to the first impression from consumers, which are the visual characteristics of foamability, beer color and bubble size. This study describes the development of an affordable and robust robotic beer pourer using low-cost sensors, Arduino® boards, Lego® building blocks and servo motors for prototyping. The RoboBEER is also coupled with video capture capabilities (iPhone 5S) and automated post hoc computer vision analysis algorithms to assess different parameters based on foamability, bubble size, alcohol content, temperature, carbon dioxide release and beer color. Results have shown that parameters obtained from different beers using only the RoboBEER can be used for their classification according to quality and fermentation type. Results were compared to sensory analysis techniques using principal component analysis (PCA) and artificial neural network (ANN) techniques. The PCA from RoboBEER data explained 73% of the variability within the data. From sensory analysis, the PCA explained 67% of the variability, and combining RoboBEER and sensory data, the PCA explained only 59% of the data variability. The ANN technique for pattern recognition allowed a classification model to be created from the parameters obtained with the RoboBEER, achieving 92.4% accuracy in the classification according to quality and fermentation type, which is consistent with the PCA results using data only from the RoboBEER. The repeatability and objectivity of beer assessment offered by the RoboBEER could translate into the development of an important practical tool for food scientists, consumers and retail companies to determine differences within beers based on the specific parameters studied.
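To give a flavour of the PCA step described above, here is a minimal sketch (not the study's actual code) of how the fraction of variability explained by the first principal components can be computed from a matrix of RoboBEER-style measurements. The data here is synthetic and the parameter names are assumptions for illustration only:

```python
# Hedged sketch: synthetic stand-in for RoboBEER measurements,
# NOT the published pipeline. Parameter names are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# 30 hypothetical pours x 5 hypothetical parameters
params = ["foamability", "bubble_size", "alcohol", "temperature", "co2"]
X = rng.normal(size=(30, len(params)))

def pca_explained_variance(X, n_components=2):
    """Fraction of total variance captured by the first n components."""
    Xc = X - X.mean(axis=0)            # centre each parameter
    _, s, _ = np.linalg.svd(Xc, full_matrices=False)
    var = s**2 / (len(X) - 1)          # eigenvalues of the covariance matrix
    return var[:n_components].sum() / var.sum()

print(f"Variance explained by first 2 PCs: {pca_explained_variance(X):.0%}")
```

On the real RoboBEER data this quantity is what the abstract reports as 73%; here it is just a number derived from random data, since the sketch only illustrates the computation.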
Enjoy the video, and please add some likes on YouTube!