Category: Uncategorized

Call for papers: Journal of Sensors


Sensors for Agriculture/Forestry Research and Applications

Call for Papers

Challenges imposed by climate change have driven significant interest and investment by many countries in research on smart digital agriculture and forestry. This interest has been heightened by a projected population increase to 9.2 billion by 2050. That pressure alone will require global food production to double on roughly half of the arable land remaining after resource depletion, which will directly affect agricultural practices and forestry resources and their management.

To overcome the effects of climate change, and for a country to remain competitive and sustainable in the agricultural and forestry sectors, these challenges must be acknowledged and research into the development and application of new and emerging sensor technologies must be supported. New and emerging technologies applied to sensor networks will help to overcome these issues by basing decision-making on more accurate, meaningful data with high spatial and temporal resolution.

Sensor technology and sensor networks using telemetry systems and the Internet of Things (IoT) are becoming important research areas for digital agriculture and forestry. Producing accurate agricultural and forestry models depends critically on the timely provision of high-quality, geospatially distributed data. This requires complex workflows of real-time sensor calibration, data transfer, and image processing and interpretation, integrated on suitable high-performance computational nodes and networks. Imaging sensor data is one example: image sensors need to be radiometrically and geometrically calibrated so that each pixel value can be reliably converted into an at-surface reflectance value. Conventional sensing systems rely on time-consuming post-processing that depends on specialized skills and specific software, significantly delaying the delivery of the final information to users. This call for papers therefore focuses on systems that provide an automated, integrated set of tools to standardize the key components of aerial and ground sensor data processing, freeing industry and academia to focus on innovation. Such systems should enable near-real-time distribution of monitored soil, plant and atmospheric factors, with data mapping and delivery via mobile devices.
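To make the radiometric calibration step above concrete, the short Python sketch below illustrates one common approach: an empirical-line fit from raw digital numbers (DN) to at-surface reflectance using calibration panels of known reflectance. The panel values, array sizes and function name are invented for illustration only and are not part of this call.

```python
# Minimal sketch of empirical-line radiometric calibration: pixel digital
# numbers (DN) are mapped to at-surface reflectance via a linear fit against
# calibration panels of known reflectance. All values below are hypothetical.
import numpy as np

# Hypothetical calibration panels: measured DN vs. known reflectance (0-1)
panel_dn = np.array([22.0, 118.0, 236.0])        # dark, grey, white panels
panel_reflectance = np.array([0.03, 0.45, 0.92])

# Least-squares fit: reflectance = gain * DN + offset
gain, offset = np.polyfit(panel_dn, panel_reflectance, deg=1)

def dn_to_reflectance(dn_band: np.ndarray) -> np.ndarray:
    """Convert a raw single-band DN image to at-surface reflectance."""
    return np.clip(gain * dn_band + offset, 0.0, 1.0)

# Example: calibrate a small synthetic image band
raw_band = np.random.randint(0, 255, size=(4, 4)).astype(float)
print(dn_to_reflectance(raw_band))
```

A linear fit is the simplest choice here; in practice the same per-band mapping would be applied automatically to every image in a workflow of the kind described above.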

The proposed technology may also include a cloud computing framework for sensor calibration, processing, fusion and classification, reducing the complexity and time required to develop such workflows.

Papers addressing the following aspects will be given particular consideration: (i) papers on frameworks that process and fuse ground-based sensor network and meteorological information with remotely sensed data from satellites and UAVs to rapidly produce high-quality geospatial products that help visualize our environment in extreme detail; (ii) research papers that use cloud computing on High-Performance Computing (HPC) platforms to enable rapid and automated processing of aerial imagery and ground-based sensor network data, streamlining the process from data acquisition to data analysis; and (iii) papers showing how knowledge and experience shared through collaboration between industry and academia can centralize development efforts for sensor data processing and visualization algorithms, leading to higher-quality geospatial products.

Potential topics include but are not limited to the following:

  • New sensor development and application for agriculture and forestry trials
  • Sensor network development, data transmission, self-healing, and redundancy considerations
  • Remote sensing using satellite, airborne, and unmanned aerial vehicles (UAV) integrated with sensor network technology
  • Visualization systems and software platforms developed to integrate sensor networks into decision-making processes
  • Low-cost smart sensors applicable to agriculture and forestry
  • Development of integrated models with sensor networks and applications to smart irrigation for agriculture and forestry environments

Authors can submit their manuscripts through the Manuscript Tracking System at http://mts.hindawi.com/submit/journals/js/safra/.

Manuscript Due: Friday, 10 February 2017
First Round of Reviews: Friday, 5 May 2017
Publication Date: Friday, 30 June 2017

Lead Guest Editor

Guest Editors

For more information click here: Journal of Sensors

New Paper: Development of a robotic pourer constructed with ubiquitous materials, open hardware and sensors to assess beer foam quality using computer vision and pattern recognition algorithms: RoboBEER

Abstract

There are currently no standardized objective measures to assess beer quality based on the most significant parameters related to consumers' first impressions, namely the visual characteristics of foamability, beer color and bubble size. This study describes the development of an affordable and robust robotic beer pourer using low-cost sensors, Arduino® boards, Lego® building blocks and servo motors for prototyping. The RoboBEER is also coupled with video capture capabilities (iPhone 5S) and automated post hoc computer vision analysis algorithms to assess different parameters based on foamability, bubble size, alcohol content, temperature, carbon dioxide release and beer color. Results showed that parameters obtained from different beers using only the RoboBEER can be used to classify them according to quality and fermentation type. Results were compared to sensory analysis using principal component analysis (PCA) and artificial neural network (ANN) techniques. The PCA of the RoboBEER data explained 73% of the variability within the data. For the sensory analysis, the PCA explained 67% of the variability, and combining RoboBEER and sensory data, the PCA explained only 59% of the data variability. The ANN technique for pattern recognition allowed the creation of a classification model from the parameters obtained with the RoboBEER, achieving 92.4% accuracy in classification according to quality and fermentation type, which is consistent with the PCA results using data only from the RoboBEER. The repeatability and objectivity of beer assessment offered by the RoboBEER could translate into the development of an important practical tool for food scientists, consumers and retail companies to determine differences within beers based on the specific parameters studied.
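As a rough illustration of the kind of pattern-recognition step described in the abstract (not the authors' actual RoboBEER code), the Python sketch below trains a small feed-forward network on RoboBEER-style foam and colour parameters to separate two hypothetical beer classes. The feature layout, synthetic data and the use of scikit-learn's MLPClassifier are assumptions made purely for illustration.

```python
# Illustrative sketch only: a small feed-forward network classifying beers
# from RoboBEER-style parameters. Data, class rule and network size are
# hypothetical stand-ins, not the published model.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical parameters per pour: foamability, bubble size, alcohol,
# temperature, CO2 release, colour
X = rng.normal(size=(120, 6))
# Two hypothetical classes (e.g. top- vs bottom-fermented), defined by a toy rule
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
```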

Food Desire and Technology Exhibition


Food Desire & Technology Exhibition

Media Lab Melbourne is partnering with the Carlton Connect Initiative on their Absolutely Famished exhibitions series around the future of food. The next exhibition in the series is by MLM’s Pierre Proske and titled Food, Desire & Technology.

The exhibition opening is on Tuesday the 6th September.

Join Media Lab Melbourne artists Pierre Proske and Travis Cox on the 21st of September for a round table on Food, AI and Robotics. More details of this event are to be announced.

The full program for Absolutely Famished can be found here:

https://www.carltonconnect.com.au/absolutely-famished/

EVENT DETAILS

FOOD DESIRE & TECHNOLOGY

6 September – 4 October 2016

OPENING Tuesday 6 September, 6-8pm

Taste unique local beers and chocolate, experience the interactive Gastro Analytics Machine, and test your emotional responses using the latest remote bio-sensing technology from the laboratory of Dr Sigfredo Fuentes.

 

Food Desire & Technology explores the fragile relation between food, desire and technology. It consists of a series of playful experiments by artist Pierre Proske that engage with current or potential commercial applications of food related technology. From exploring food consumption data analytics, to questions around new gene editing tools such as CRISPR as well as a data visualization of aphrodisiacs, Proske deploys various custom software tools to produce a series of works intended to stimulate discussion around the role of new technologies as mediators of desire in the food industry.

 

The exhibition opening will showcase the Gastro Analytics Machine, an interactive installation that measures your emotions while you sample food or drink. In the future, food retailers will be drawing on emotion recognition and deep learning Artificial Intelligence networks to learn and predict what you like to eat. Sample a range of exotic beers and chocolates and have your response scanned, categorised, archived and visualized.

 

Food Desire & Technology is part of Absolutely Famished, a creative exploration of future food curated by Dr Renee Beale. Underpinned by scientific research, it imagines the 22nd-century marketplace.

EVENT TICKETS

This is a free event.

Tickets: http://www.carltonconnect.com.au/food-desire-and-technology-exhibition-opening/

Date and time: Tuesday 6th of September, 6-8pm

Venue: LAB-14 Gallery, Carlton Connect Initiative, 700 Swanston Street, Carlton


 


New Article: Development of a robotic and computer vision method to assess foam quality in sparkling wines.

Abstract
Quality assessment of food products and beverages might be performed by the human senses of smell, taste, sound and touch. Likewise, sparkling wines and carbonated beverages are fundamentally assessed by sensory evaluation. Computer vision is an emerging technique that has been applied in the food industry to objectively assist quality and process control. However, publications describing the application of this novel technology to carbonated beverages are scarce, as the methodology requires tailored techniques to address the presence of carbonation and foamability. Here we present a robotic pourer (FIZZeyeRobot), which normalizes the variability of foam and bubble development during pouring into a vessel. It is coupled with video capture to assess several parameters of foam quality, including foamability (the ability of the foam to form), drainability (the ability of the foam to resist drainage), and bubble count and allometry. The foam parameters investigated were analyzed in combination with the wine scores, chemical parameters obtained from laboratory analysis, and manual measurements for validation purposes. Results showed that higher quality scores from trained panelists were positively correlated with foam stability and negatively correlated with the velocity of foam dissipation and the height of the collar. Significant correlations were observed between the wine quality measurements of total protein, titratable acidity, pH and foam expansion. The percentage of wine in the foam was found to promote the formation of smaller bubbles and to reduce foamability, while drainability was negatively correlated with foam stability and positively correlated with the duration of the collar. Finally, wines were grouped according to their foam and bubble characteristics, quality scores and chemical parameters. The technique developed in this study objectively assessed foam characteristics of sparkling wines using image analysis whilst remaining a cost-effective, fast, repeatable and reliable robotic method. Relationships between wine composition and the automatically obtained bubble and foam parameters might assist in unraveling factors contributing to wine quality and suggest directions for further research.
Figure 1: Diagram illustrating the procedure used for quantifying foam parameters.
Image: the robotic pourer.
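For readers curious how foam height might be tracked from the captured video, the sketch below shows one simple possibility: thresholding the bright foam region in each frame of a fixed-camera recording and counting the rows it occupies. This is a hypothetical Python/OpenCV illustration, not the FIZZeyeRobot implementation; the video file name, region of interest and threshold are placeholders.

```python
# Rough sketch (not the authors' code): track foam height over time from a
# fixed-camera pour video by thresholding the bright foam region per frame.
import cv2
import numpy as np

cap = cv2.VideoCapture("pour_video.mp4")   # placeholder file name
foam_heights = []                          # foam height (pixels) per frame

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    roi = gray[100:500, 200:400]           # hypothetical region over the glass
    _, foam = cv2.threshold(roi, 200, 255, cv2.THRESH_BINARY)  # bright = foam
    rows_with_foam = np.where(foam.any(axis=1))[0]
    foam_heights.append(rows_with_foam.size)

cap.release()
# Foamability relates to the maximum foam height; drainability to its decay rate
if foam_heights:
    print("max foam height (px):", max(foam_heights))
```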

New Article: Assessment of an automated digital method to estimate leaf area index (LAI) in cherry trees

Abstract
A study was carried out during two growing seasons to evaluate the performance of a digital photography method to estimate LAI (LAID). The trial consisted of 10 ‘Bing’ and 10 ‘Sweetheart’ trees for which actual LAI (LAIA) was obtained from allometric relations. Estimates of LAID were obtained by batch processing of canopy images of the same trees, captured with a conventional digital RGB camera. Comparisons of average LAIA and LAID showed a good level of agreement for ‘Sweetheart’ over the two growing seasons (mean absolute percent error, MAE% = 10.4%). For ‘Bing’, LAID was accurate in the first growing season (MAE% = 17.7%) but was underestimated by 44% (MAE%) in the second growing season, presumably due to differences observed in the clumping index and the light extinction coefficient. The results evidenced the robustness of this simple method for determining the LAI of cherry trees.
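The error metric quoted above (MAE%, the mean absolute percent error between allometric LAIA and digitally estimated LAID) can be computed as in the short sketch below; the per-tree LAI values are invented for illustration and are not the study's data.

```python
# Sketch of the MAE% (mean absolute percent error) comparison between
# allometric LAI (LAIA) and digitally estimated LAI (LAID).
# The values below are hypothetical, not the published measurements.
import numpy as np

lai_allometric = np.array([2.8, 3.1, 2.5, 3.4, 2.9])   # hypothetical LAIA per tree
lai_digital    = np.array([2.6, 3.3, 2.2, 3.1, 3.0])   # hypothetical LAID per tree

mae_percent = 100.0 * np.mean(np.abs(lai_digital - lai_allometric) / lai_allometric)
print(f"MAE% = {mae_percent:.1f} %")
```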