New Interview: The Vineyard of The Future. Italian Blog


How much do Australian wine growers spend on new technologies like UAVs, sensors and robots?

Information on this is very scattered, and there are no formal studies on spending by end users, specifically winegrowers in Australia. Wine Australia may have more information, especially on funding spent on research and applications for the Australian wine industry: Australian grape growers pay an annual levy, which Wine Australia administers for research that benefits the industry, and these funds are awarded on a competitive basis to research institutions. In 2015 there was a specific call, Digital Viticulture, which addressed all of these topics. This gives a sense of where we stand: we are in the early stages of formally applying this technology for commercial purposes, beyond what can be found in advertisements from private companies offering drone- and robot-based services for management of irrigation, fertilization, pest and disease control, and so on.

Link to the full interview: CLICK HERE

New Interview: Digital Vineyards: building technology so farms can think for themselves


Dr Sigfredo Fuentes is a plant physiologist and engineer at the University of Melbourne whose end game is the smart farm.

In his mind, it won’t be restricted to a single crop – maybe it’ll grow peaches and nectarines and vines as well – all of which are mapped out digitally.

And at the heart of the operation will be the drone, autonomous and equipped to collect all the data it needs to monitor the entire farm down to the last plant or tree.

And it’s fast – it can cover 2500 hectares in a day, which is about 2000 football fields.

Link to full article: CLICK HERE

Call for papers: Journal of Sensors


Sensors for Agriculture/Forestry Research and Applications

Call for Papers

Challenges imposed by climate change have spurred significant interest and investment by many countries in research areas around smart digital agriculture and forestry. This has been especially triggered by a projected population increase to 9.2 billion by 2050. This pressure alone will require doubling global food production on half the arable land that will remain after resource depletion, which will directly affect agricultural practices and forestry resources and management.

To overcome the effects of climate change and remain competitive and sustainable in the agricultural and forestry sectors, countries need to acknowledge these challenges and support research on, and application of, new and emerging sensor technologies. The development of new and emerging technologies applied to sensor networks will help address these issues by basing decision-making on more accurate and meaningful data at high spatial and temporal resolution.

Sensor technology and sensor networks using telemetry systems and the Internet of Things (IoT) are becoming important research areas that can be applied to digital agriculture and forestry. The key challenge in producing accurate agricultural and forestry models lies in the timely provision of high-quality, geospatially distributed data. This requires complex workflows of real-time sensor calibration, data transfer, and image processing and interpretation, integrated on optimal, high-performance computational nodes and networks. Imaging sensor data is one example: image sensors need to be radiometrically and geometrically calibrated so that each pixel value can be reliably converted into an at-surface reflectance value. Conventional sensing systems rely on time-consuming postprocessing that depends on specialized skills and specific software, significantly delaying delivery of the final information to users. This call for papers therefore focuses on systems that provide an automated, integrated set of tools to standardize the key components of aerial and ground sensor data processing, empowering industry and academics to focus on innovation. Such systems would enable near-real-time distribution of monitored soil, plant and atmospheric factors, allowing data mapping and delivery via mobile devices.
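The radiometric calibration step mentioned above, converting raw pixel values into at-surface reflectance, is commonly done with an empirical-line fit against reference panels of known reflectance. A minimal sketch in Python, with entirely hypothetical panel reflectances and digital numbers (DN), not values from any specific sensor:

```python
# Sketch of empirical-line radiometric calibration: fit a linear model
# (reflectance = gain * DN + offset) from two in-scene reference panels,
# then apply it to raw pixel digital numbers (DN).
# Panel reflectances and DN values below are illustrative assumptions.

def empirical_line(dn_dark, refl_dark, dn_bright, refl_bright):
    """Fit gain/offset so that reflectance = gain * DN + offset."""
    gain = (refl_bright - refl_dark) / (dn_bright - dn_dark)
    offset = refl_dark - gain * dn_dark
    return gain, offset

def calibrate(pixels, gain, offset):
    """Apply the linear calibration to a list of raw pixel DNs."""
    return [gain * dn + offset for dn in pixels]

# Hypothetical calibration targets: a 5% and a 50% reflectance panel.
gain, offset = empirical_line(dn_dark=200, refl_dark=0.05,
                              dn_bright=2000, refl_bright=0.50)
row = calibrate([200, 1000, 2000], gain, offset)
print([round(r, 2) for r in row])  # [0.05, 0.25, 0.5]
```

In an automated workflow of the kind the call describes, this fit would be refreshed per flight from panels captured in the imagery, so each pixel can be converted to reflectance without manual postprocessing.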

The proposed technology can also include a cloud computing framework for sensor calibration, processing, fusion and classification, reducing the complexity and time required to develop workflows.

Papers addressing the following aspects will be highly considered: (i) frameworks to process and fuse ground-based sensor networks and meteorological information with remotely sensed data from satellites and UAVs, rapidly producing high-quality geospatial products that help visualize our environment in extreme detail; (ii) research using cloud computing on High-Performance Computing (HPC) platforms to enable rapid, automated processing of aerial imagery and ground-based sensor network data, streamlining the process from data acquisition to data analysis; and (iii) papers sharing knowledge and experience gained through collaboration between industry and academia to centralize development of sensor data processing and visualization algorithms, leading to higher-quality geospatial products.

Potential topics include but are not limited to the following:

  • New sensor development and application for agriculture and forestry trials
  • Sensor network development, data transmission, self-healing, and redundancy considerations
  • Remote sensing using satellite, airborne, and unmanned aerial vehicles (UAV) integrated with sensor network technology
  • Visualization systems and software platforms developed to integrate sensor networks for decision making processes
  • Low-cost smart sensors applicable to agriculture and forestry
  • Development of integrated models with sensor networks and applications to smart irrigation for agriculture and forestry environments

Authors can submit their manuscripts through the Manuscript Tracking System at

Manuscript Due Friday, 10 February 2017
First Round of Reviews Friday, 5 May 2017
Publication Date Friday, 30 June 2017

Lead Guest Editor

Guest Editors

For more information click here: Journal of Sensors

New Paper: Development of a robotic pourer constructed with ubiquitous materials, open hardware and sensors to assess beer foam quality using computer vision and pattern recognition algorithms: RoboBEER


There are currently no standardized objective measures to assess beer quality based on the most significant parameters related to consumers' first impressions: the visual characteristics of foamability, beer color and bubble size. This study describes the development of an affordable and robust robotic beer pourer using low-cost sensors, Arduino® boards, Lego® building blocks and servo motors for prototyping. The RoboBEER is also coupled with video capture capabilities (iPhone 5S) and automated post hoc computer vision analysis algorithms to assess different parameters based on foamability, bubble size, alcohol content, temperature, carbon dioxide release and beer color. Results have shown that parameters obtained from different beers using only the RoboBEER can be used to classify them according to quality and fermentation type. Results were compared to sensory analysis using principal component analysis (PCA) and artificial neural network (ANN) techniques. The PCA from RoboBEER data explained 73% of the variability within the data; from sensory analysis, the PCA explained 67% of the variability; and combining RoboBEER and sensory data, the PCA explained only 59% of the variability. The ANN pattern recognition technique enabled the creation of a classification model from the parameters obtained with the RoboBEER, achieving 92.4% accuracy in classification according to quality and fermentation type, which is consistent with the PCA results using data only from the RoboBEER. The repeatability and objectivity of beer assessment offered by the RoboBEER could translate into an important practical tool for food scientists, consumers and retail companies to determine differences between beers based on the specific parameters studied.
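The paper's classifier is an artificial neural network trained on the RoboBEER parameters; as a much simpler stand-in to illustrate the idea of classifying beers by fermentation type from foam-related features, here is a nearest-centroid sketch in Python. The two features (foam stability, mean bubble size) and all numeric values are hypothetical, not data from the study:

```python
# Illustrative only: the study uses PCA and an ANN; a nearest-centroid
# classifier stands in here to show classification by fermentation type
# from foam parameters. All feature values are hypothetical.
import math

TRAINING = {
    "top_fermented":    [(0.82, 0.41), (0.78, 0.44), (0.85, 0.39)],
    "bottom_fermented": [(0.55, 0.62), (0.60, 0.58), (0.52, 0.65)],
}

def centroid(points):
    """Component-wise mean of a list of feature tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(sample, training):
    """Assign sample to the class whose centroid is nearest (Euclidean)."""
    best, best_d = None, float("inf")
    for label, pts in training.items():
        d = math.dist(sample, centroid(pts))
        if d < best_d:
            best, best_d = label, d
    return best

print(classify((0.80, 0.42), TRAINING))  # top_fermented
```

A real replication would feed all RoboBEER parameters (foamability, bubble size, CO2 release, color, etc.) into a trained neural network rather than two hand-picked features.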

New Article: Development of a robotic and computer vision method to assess foam quality in sparkling wines

Quality assessment of food products and beverages may be performed using the human senses of smell, taste, sound and touch. Likewise, sparkling wines and carbonated beverages are fundamentally assessed by sensory evaluation. Computer vision is an emerging technique that has been applied in the food industry to objectively assist quality and process control. However, publications describing the application of this novel technology to carbonated beverages are scarce, as the methodology requires tailored techniques to address the presence of carbonation and foamability. Here we present a robotic pourer (FIZZeyeRobot), which normalizes the variability of foam and bubble development during pouring into a vessel. It is coupled with video capture to assess several parameters of foam quality, including foamability (the ability of the foam to form), drainability (the ability of the foam to resist drainage), and bubble count and allometry. The foam parameters investigated were analyzed in combination with the wine scores, chemical parameters obtained from laboratory analysis and manual measurements for validation purposes. Results showed that higher quality scores from trained panelists were positively correlated with foam stability and negatively correlated with the velocity of foam dissipation and the height of the collar. Significant correlations were observed between the wine quality measurements of total protein, titratable acidity, pH and foam expansion. The percentage of wine in the foam was found to promote the formation of smaller bubbles and to reduce foamability, while drainability was negatively correlated with foam stability and positively correlated with the duration of the collar. Finally, wines were grouped according to their foam and bubble characteristics, quality scores and chemical parameters.
The technique developed in this study objectively assessed foam characteristics of sparkling wines using image analysis while maintaining a cost-effective, fast, repeatable and reliable robotic method. Relationships between wine composition and automatically obtained bubble and foam parameters might assist in unraveling factors contributing to wine quality and directions for further research.
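The correlations reported above (e.g. between panel quality scores and foam stability) are of the kind computed with Pearson's r. A minimal sketch in Python, using made-up numbers rather than the study's data:

```python
# Sketch of a Pearson correlation between panel quality scores and one
# foam parameter. All values below are hypothetical, for illustration.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

quality_score  = [6.1, 7.4, 8.0, 5.2, 6.8]     # hypothetical panel scores
foam_stability = [0.42, 0.58, 0.66, 0.31, 0.49]  # hypothetical parameter

r = pearson_r(quality_score, foam_stability)
print(round(r, 3))  # close to +1: strong positive correlation
```

In the study the same style of analysis is run across many foam, bubble and chemical parameters to find which ones track quality scores positively or negatively.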

New Article: Assessment of an automated digital method to estimate leaf area index (LAI) in cherry trees

A study was carried out over two growing seasons to evaluate the performance of a digital photography method to estimate LAI (LAID). The trial consisted of 10 ‘Bing’ and 10 ‘Sweetheart’ trees for which actual LAI (LAIA) was obtained through allometric relations. LAID estimates were obtained by batch processing of canopy images of the same trees, captured with a conventional digital RGB camera. Comparison of average LAIA and LAID showed good agreement for ‘Sweetheart’ over the two growing seasons (mean absolute percent error, MAE% = 10.4%). For ‘Bing’, LAID was accurate in the first growing season (MAE% = 17.7%) but underestimated by 44% (MAE%) in the second, presumably due to differences observed in the clumping index and the light extinction coefficient. The results evidenced the robustness of this simple method for determining the LAI of cherry trees.
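Digital photography methods of this kind typically invert the Beer-Lambert law, LAI = -ln(gap fraction) / k, where the gap fraction is the proportion of sky pixels in the canopy image and k is the light extinction coefficient, and then report agreement as MAE%. A hedged Python sketch with illustrative numbers (not the study's data, and not necessarily the exact formulation used in the paper):

```python
# Sketch of LAI estimation from canopy gap fraction via Beer-Lambert
# inversion, plus the mean absolute percent error (MAE%) used to compare
# digital estimates (LAID) against allometric values (LAIA).
# Gap fractions, k and LAIA values below are hypothetical.
import math

def lai_from_gap_fraction(gap_fraction, k=0.5):
    """Invert Beer-Lambert: canopy transmittance ~ exp(-k * LAI)."""
    return -math.log(gap_fraction) / k

def mae_percent(actual, estimated):
    """Mean absolute percent error between LAIA and LAID, per tree."""
    errs = [abs(a - e) / a for a, e in zip(actual, estimated)]
    return 100 * sum(errs) / len(errs)

gaps  = [0.30, 0.20, 0.45]   # hypothetical per-tree sky-pixel fractions
lai_d = [lai_from_gap_fraction(g, k=0.5) for g in gaps]
lai_a = [2.5, 3.1, 1.7]      # hypothetical allometric LAI values

print([round(v, 2) for v in lai_d])
print(round(mae_percent(lai_a, lai_d), 1))
```

The paper's note about the clumping index and extinction coefficient fits this picture: if k (or the clumping correction) differs between seasons but a fixed value is assumed, the inversion systematically under- or overestimates LAI.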