I have a greenhouse. I grow chillies in said greenhouse. I would like to use technology to improve my yield.
Using a Raspberry Pi, some cheap electronics and some Python I've started to write the telemetry software that will allow me to make better choices. I've been thinking about how far I can take this.
One area that will eventually provide the input needed to close the loop on the whole system is plant growth monitoring.
So we need something to measure, and as I see it, plants have two phases which can be measured.
Phase 1: From Seed
Measuring germination would be a case of detecting something appearing from nothing, the measurement being the time from planting to germination. That time should be pretty short (1-4 weeks), and anything longer is a failure.
Another useful metric might be the number of seeds which germinate from a given number planted.
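As a sketch of what those first-phase metrics might look like in Python (the function name, dict layout and 4-week failure threshold are my own illustrative choices):

```python
from datetime import date

def germination_metrics(planted_on, germination_dates, seeds_planted):
    """Summarise one germination round.

    germination_dates: the dates on which individual seeds emerged;
    seeds that never germinate simply have no entry.
    """
    rate = len(germination_dates) / seeds_planted
    days = [(d - planted_on).days for d in germination_dates]
    mean_days = sum(days) / len(days) if days else None
    # Anything much past ~4 weeks is treated as a failed round.
    failed = mean_days is not None and mean_days > 28
    return {"rate": rate, "mean_days": mean_days, "failed": failed}

print(germination_metrics(
    date(2020, 3, 1),
    [date(2020, 3, 10), date(2020, 3, 12), date(2020, 3, 15)],
    seeds_planted=5,
))
```

This captures both metrics at once: the germination rate (3 of 5 here) and the average time from planting to emergence.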
Phase 2: From Germination
This is where it gets interesting, as this feedback can be fed into the cycle and adjustments made to improve. Useful metrics might be:
- Height of plant
- Physiological age: number of leaves
- Size of leaves
- Number of flowers/fruits
- Changes in color
- Weight of plant (with some compensation for watering)
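A natural first step for the telemetry software is a record type holding one observation per plant. This is only a sketch; the field names and units are my assumptions, not an existing schema:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class GrowthSnapshot:
    """One observation of a single plant; fields mirror the metric list above.

    Optional fields stay None until the sensor or software that
    produces them exists.
    """
    plant_id: str
    taken_at: datetime
    height_mm: Optional[float] = None
    leaf_count: Optional[int] = None
    mean_leaf_area_mm2: Optional[float] = None
    flower_count: int = 0
    fruit_count: int = 0
    mean_hue_deg: Optional[float] = None   # for tracking colour changes
    weight_g: Optional[float] = None       # raw, before watering compensation

snap = GrowthSnapshot("chilli-07", datetime(2020, 5, 1, 9, 0),
                      height_mm=412.0, leaf_count=18)
print(snap.plant_id, snap.height_mm)
```

Appending these snapshots to a log over time is what turns the individual metrics into trends we can act on.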
So we can technically measure the first phase, but unless we are doing several rounds of germination, or feeding the data back into next year's planting, it has limited value. I'll be concentrating on the second phase, as it fits the brief of actions based on feedback to improve yield.
So we know what we want to measure; how do we measure it?
The simplest method of gauging plant height has been used for a very long time and is called an auxanometer. Essentially, a string is tied to the top of the plant and runs over a pulley to a weight; as the plant grows, the weight lowers. This is linear motion, which is easy to measure.
The other option is image processing, which we'll discuss as part of...
Leaf count, Leaf Size, Flower/Fruit Count, Color Changes
Without a manual process, these are much harder to read. Ideas include a camera system which does image processing^, LIDAR which creates 3D models, infrared, or some combination such as the Xbox Kinect^.
An interesting option is plant weight. Plants will obviously increase in weight as they increase in size. There are a couple of gotchas. The first is that plants need to be watered, different states of watering mean different weights, and that offset will need to be accounted for. The second is that the weight includes anything on the plant (moisture, pests, dirt) and makes no distinction between leaf, stem, flower or fruit, though the latter is a lesser concern.
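The watering offset could be handled with a crude model: subtract the pot and the water still held in the compost, estimated from how long ago we watered. Everything here is a placeholder you would calibrate, especially the loss rate:

```python
def dry_weight_estimate(scale_reading_g: float,
                        pot_weight_g: float,
                        water_added_g: float,
                        hours_since_watering: float,
                        loss_rate_g_per_hour: float = 4.0) -> float:
    """Rough plant-weight estimate from a load cell under the pot.

    Assumes water leaves the pot (evaporation plus uptake) at a roughly
    constant rate; loss_rate_g_per_hour is a made-up default that would
    need calibrating per greenhouse and compost mix.
    """
    water_remaining = max(0.0, water_added_g
                               - loss_rate_g_per_hour * hours_since_watering)
    return scale_reading_g - pot_weight_g - water_remaining

# 1500 g on the scale, 600 g pot, watered 250 g ten hours ago:
print(dry_weight_estimate(1500, 600, 250, 10))
```

Even a model this simple makes readings taken at different points in the watering cycle comparable, which is all the growth trend needs.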
Things like the structure of the plant and its leaves, and the direction of growth, can be handy for detecting disease, lack of water or excess heat. Being able to detect, and possibly identify, pests could then lead to suggested or automated treatments.
A Way Forward
There are quick wins, the auxanometer and the weighing platform, but ultimately, if we don't want multiple machines around each plant, the efficient solution is image/LIDAR/infrared processing, which could do it all in a very compact way. That, of course, is not a quick-win solution and will require quite advanced software.
So the plan, if I go forward with it, will be to use a camera (perhaps an Xbox Kinect) and write software to process height, leaf count and size, fruit and flower production, and colour.
Steps (quick wins first)
- Get the camera set up, attached to my power grid, and adjust as necessary
- Add software to tell plants apart even when they are moved, perhaps via an identifying mark on the plant pot such as a number or even a barcode, allowing us to encode things like plant type right there on the pot.
- Expand software to detect changes in plant color
- Expand software to detect height
- Expand software and hardware to detect leaf/flower size/count
The last one may require hardware changes, as a static camera and a static plant could mean that leaves/flowers are hidden from the camera by the plant itself. This hardware expansion may also be useful for rotating plants away from the sun to help them build good stems.
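Two of the software steps above can be sketched with nothing beyond the standard library. The pot-label format (`TYPE-VARIETY-NUMBER`) is my own invention for illustration, and the hue function assumes the camera frame has already been segmented down to plant pixels:

```python
import colorsys

def parse_pot_label(label: str) -> dict:
    """Decode a pot mark of the form TYPE-VARIETY-NUMBER,
    e.g. what a barcode on the pot might encode."""
    plant_type, variety, number = label.split("-")
    return {"type": plant_type, "variety": variety, "number": int(number)}

def mean_hue_degrees(rgb_pixels) -> float:
    """Average hue (0-360) of pixels already masked to the plant.

    A naive arithmetic mean of hue; good enough for tracking the
    green-to-yellow drift of a stressed plant, but it would need a
    circular mean to handle reds near the 0/360 wrap-around.
    """
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
            for r, g, b in rgb_pixels]
    return 360.0 * sum(hues) / len(hues)

print(parse_pot_label("CHILLI-CAYENNE-007"))
healthy_sample = [(40, 160, 40)] * 10   # green-ish pixels
print(round(mean_hue_degrees(healthy_sample)))  # 120, i.e. green
```

Logging the mean hue per plant ID over days is the simplest possible "detect changes in plant colour": a sustained drift away from green is the signal, whatever its cause.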
What to do with all this data?
Very quickly, from a little data, we can start making the lives of our plants better, acting autonomously and suggesting manual remedies for common problems:
Under-watered or overheated plants tend to lose structural integrity in their leaves. We can water them, or open vents to cool the greenhouse.
Pests or disease will discolour the plant. Some pests can be removed with a spray of water, others with natural pesticides (like a rhubarb/chilli tea), and others could prompt a change in gardening practice, like adding plants that encourage ladybirds to eat pests such as aphids.
A lack of growth, or of flowers, could suggest that we are over-watering or under-feeding.
Flowers which don't reach maturity could suggest improper feeding or undesirable humidity levels.
Growth direction could indicate that plants need to be turned more regularly so that they have to reach towards the sun, strengthening their stems.
Struggling plants might suggest greenhouse changes such as a better base, insulation or the introduction of a thermal mass to limit temperature spikes.
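The remedies above boil down to a small rule engine mapping observations to actions. A minimal sketch; every threshold and field name here is an illustrative placeholder, not a calibrated value:

```python
def suggest_actions(obs: dict) -> list:
    """Map plant observations to remedies from the list above.

    obs fields (all hypothetical): leaf_droop_score in 0-1,
    temp_c, hue_shift_deg since baseline, growth_mm_per_week.
    """
    actions = []
    droop = obs.get("leaf_droop_score", 0.0)
    if droop > 0.5 and obs.get("temp_c", 0.0) > 32:
        actions.append("open vents")        # overheated: cool the greenhouse
    elif droop > 0.5:
        actions.append("water plant")       # droopy but not hot: likely thirsty
    if obs.get("hue_shift_deg", 0.0) > 25:
        actions.append("inspect for pests/disease; try a water spray")
    if obs.get("growth_mm_per_week", 99.0) < 5:
        actions.append("review watering and feeding regime")
    return actions

print(suggest_actions({"leaf_droop_score": 0.7, "temp_c": 35}))
```

Starting with hand-written rules like these also gives us labelled outcomes, which is exactly what any later machine-learned replacement would need.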
And the less time we spend thinking about common problems, the more time we can spend thinking about how to increase yield volume and quality.
^1: Lin, Ta-Te, Lai, Tsung-Cheng, Liu, Ting-Yu, Yeh, Yu-Hui, Liu, Chang-Chih & Chung, Wei-Chang (2019). An Automatic Vision-Based Plant Growth Measurement System for Leafy Vegetables.
^2: Iljazi, J. (2017). Deep Learning for Image-Based Prediction of Plant Growth in City Farms.
^3: Hu, Yang, Wang, Le, Xiang, Lirong, Wu, Qian & Jiang, Huanyu (2018). Automatic Non-Destructive Growth Measurement of Leafy Vegetables Based on Kinect. Sensors, 18, 806. doi:10.3390/s18030806.
^4: Xia, Chunlei, Wang, Longtan, Chung, Bu-Keun & Lee, Jang-Myung (2015). In Situ 3D Segmentation of Individual Plant Leaves Using a RGB-D Camera for Agricultural Automation. Sensors, ISSN 1424-8220.
^5: Martinez-Guanter, Jorge, Ribeiro, Ángela, et al. (2019). Low-Cost Three-Dimensional Modeling of Crop Plants. Sensors, 19, 2883. doi:10.3390/s19132883.