Connectivity Solutions Will Not Help with Autonomous Farming

Over the last two years, agricultural OEMs have been in a rush to get data from their machines to their cloud servers. Claas uses Proemion's connectivity solution. Agco, Grimme, Krone and a few others bet on AgriRouter running on SAP's cloud servers. Holmer seems to favour Bosch's IoT suite. All these solutions have one thing in common: they transfer very little data, typically fewer than 50 data points per second, and they top out at roughly 100 data points per second. They are good for task or order management, asset tracking, fleet management, monitoring of machine health and remote diagnosis.

At least some of these companies seem to think that these connectivity solutions will help them with driverless or autonomous farming tasks like harvesting, seeding, spraying or fertilising. The wishful thinking goes like this.

The farm machines transfer data to the cloud. The data is used to train machine learning algorithms on extremely powerful compute servers. The resulting model automagically detects whether grains are dirty, maize leaves are dry or the heads of sugar beets are chopped off. Typically, this model runs in the cloud as well, because it requires more compute power than is available on the farm machine, or because it adapts to new conditions and self-optimises through continuous learning. This is how voice assistants like Alexa and Siri understand spoken language and how medical software recognises cancer cells in MRI scans.

This approach does not work for farm machines. Here is why and what we can do about it.

Bad and Expensive Connectivity

The main problem is that machine learning needs huge amounts of data to yield good results. Connectivity in the field is often bad. We are lucky if we have a 2G connection in rural areas. This is not enough for 50 data points per second. The telematics box must buffer the data and send it later when connectivity is better.
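The buffering just described is a classic store-and-forward queue. The following is a minimal sketch of that idea – the class and method names are my own invention, not an API of any telematics product:

```python
from collections import deque


class TelematicsBuffer:
    """Store-and-forward buffer: queue data points while the connection
    is down, flush them when connectivity returns."""

    def __init__(self):
        self._queue = deque()

    def record(self, data_point):
        # Always enqueue; nothing is lost while the machine is offline.
        self._queue.append(data_point)

    def flush(self, is_connected, send):
        """Transmit buffered points through `send` for as long as
        `is_connected()` holds. Returns the number of points sent."""
        sent = 0
        while self._queue and is_connected():
            send(self._queue.popleft())
            sent += 1
        return sent
```

A real telematics box would add persistence and retry policies, but the core logic is this simple: record unconditionally, transmit opportunistically.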

A good 3G connection can easily handle 100 or more data points per second. However, we run into another limiting factor: cost. Transferring 100 data points per second at 6 bytes per data point adds up to roughly 1.5 GB of raw data per month. We can bring the data volume down to 300-500 MB by applying some clever compression and compaction techniques. Currently, we pay roughly 50 euros per machine per month for such a connectivity solution – and we pay not only for the harvesting season of 2-4 months but also for the idle 8-10 months.
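The 1.5 GB figure is easy to verify with back-of-the-envelope arithmetic (the 30-day month is an assumption for round numbers):

```python
points_per_second = 100
bytes_per_point = 6
seconds_per_month = 60 * 60 * 24 * 30  # assuming a 30-day month

raw_bytes = points_per_second * bytes_per_point * seconds_per_month
print(raw_bytes / 1e9)  # ≈ 1.56 GB of raw data per month
```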

If we wanted to use the data for machine learning, we would be talking about vastly larger amounts of data. We would have to transfer the complete CAN traffic and the video streams from several cameras – just to start with. The cost would be astronomical.
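To get a feel for the scale, here is a rough estimate. The numbers are my assumptions, not measurements: a 500 kbit/s CAN bus at 40 % load, a single camera streaming compressed video at 4 Mbit/s, and an 8-hour working day:

```python
SECONDS_PER_DAY = 8 * 3600  # assuming an 8-hour working day

# Assumed rates (illustrative only):
can_bytes = 500_000 * 0.4 / 8 * SECONDS_PER_DAY    # ≈ 0.72 GB per day
video_bytes = 4_000_000 / 8 * SECONDS_PER_DAY      # ≈ 14.4 GB per day

print((can_bytes + video_bytes) / 1e9)  # ≈ 15 GB per day, per camera
```

Compare that with the 300-500 MB per month of the connectivity solutions above: a single camera produces roughly a month's worth of telematics data every few minutes.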

A Fix for Now

If we cannot transfer the data from the machine to the cloud, we must move the training of a model and its application from the cloud to the machine. This is certainly the way to go, but it is not feasible at the moment.

Machine learning requires a lot of compute power to crunch through terabytes of data and to find an optimal model for image recognition or for the continuous, automatic adjustment of the cutting height of maize. These algorithms need clusters of high-performance computers, equipped with graphics cards on which thousands of small cores work in parallel, and connected by high-speed networks.

Such enormous compute power will not be available on mobile machines for many years to come. These days, the most powerful computer on a tractor or harvester is the terminal. The Agritechnica 2017 crop of terminals sports at most a quad-core NXP i.MX6 processor (see my post Agritechnica 2017: What’s New for Terminals?). This is the same performance class as the iPhone 4S – released in October 2011. This is ancient technology.

These terminals would not be able to run the models produced by machine learning algorithms, let alone the learning algorithms themselves. Even in 2018, voice assistants record the spoken sentences and send the recording to a powerful cloud server. The cloud server transcribes the speech into text and sends the text back to the phone. Even contemporary phones are not powerful enough to perform voice recognition on board. They would be powerful enough to recognise speech with a restricted vocabulary, although the battery would suffer heavily from all the computations. Fortunately, farm machines have bigger batteries than phones.

The fix for now would be to build terminals with processors or systems-on-chip (SoCs) comparable to those of current high-end phones. One such SoC is the Renesas R-Car H3 with four 64-bit Cortex-A57 cores, four 64-bit Cortex-A53 cores and one Cortex-R7 core for real-time and safety-critical applications. Another is the NVIDIA Tegra X1 with its 256 GPU cores. These SoCs have a performance similar to mid-range laptops. They are capable of running trained models and even of performing some learning online on the terminal.
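To make "running a trained model on the terminal" concrete: at its core, inference is just repeated multiply-accumulate work, layer by layer. Here is a toy forward pass in pure Python – the network shape and weights are made up for illustration; a real crop-classification model would be orders of magnitude larger and would use an optimised runtime, not plain Python:

```python
def relu(xs):
    """Standard ReLU activation: clamp negative values to zero."""
    return [max(0.0, x) for x in xs]


def dense(inputs, weights, biases):
    """One fully connected layer:
    output[j] = sum_i inputs[i] * weights[i][j] + biases[j]."""
    return [
        sum(i * w for i, w in zip(inputs, column)) + b
        for column, b in zip(zip(*weights), biases)
    ]


def forward(features, layers):
    """Run the feature vector through a list of (weights, biases)
    layers, with ReLU between layers but not after the last one."""
    x = features
    for k, (w, b) in enumerate(layers):
        x = dense(x, w, b)
        if k < len(layers) - 1:
            x = relu(x)
    return x
```

The point is not the code itself but the workload it stands for: millions of these multiply-accumulates per camera frame, which is exactly what GPU cores like those on the Tegra X1 are built to parallelise.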

Dear terminal manufacturers: When will you wake up and deliver terminals fit for current challenges in autonomous farming?

Of course, we could also put a separate high-performance computer for autonomous farming into the cabin – but then the terminal manufacturers would miss a great business opportunity.

This leaves the problem of transferring the huge amounts of data to the compute servers. We have already found out that connectivity solutions are of no help, so we must use a different approach.

In a low-tech approach, the terminal saves the data to an SD card or USB drive, and the driver regularly uploads the contents of the card or drive to the servers of the agricultural OEM. Less low-tech: when the machine is within reach of a good WLAN, 4G or 5G Internet connection, the terminal automatically uploads the data. If the machine doesn’t have good Internet connectivity often enough, the terminal transfers the data to the driver’s phone via Bluetooth or WLAN. When the driver has good Internet connectivity – typically every night – the phone uploads the data.
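The upload strategy above boils down to a preference order among transports. A minimal sketch – the transport names and the shape of the availability check are my own assumptions:

```python
def choose_upload_path(available):
    """Pick the best available transport: fast local or mobile networks
    first, the driver's phone as a relay next, and the SD card or USB
    drive as the manual fallback."""
    preference = ["wlan", "5g", "4g", "phone_relay", "sd_card"]
    for transport in preference:
        if transport in available:
            return transport
    return None  # nothing available: keep buffering and retry later
```

For example, `choose_upload_path({"4g", "sd_card"})` picks `"4g"`, while an empty set means the terminal keeps buffering.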

So, collecting the necessary data isn’t the problem. However, it is important to identify now which data we will need for the machine learning algorithms. If data is missing, we have to install sensors and cameras on the machine to capture it. If we don’t get everything in place before this year’s season, we will lose a whole year.

Dear agricultural OEMs: Put everything (sensors, cameras, software, etc.) in place now – before the season starts – so that you can collect all the data required for machine learning algorithms. Otherwise, your machines will be dumb for another year.

A Better Fix in Five Years

This week, ARM unveiled two new chip designs to make machine learning feasible and fast on embedded devices. From the article on The Verge:

The designs are for the ARM Machine Learning (ML) Processor, which will speed up general AI applications from machine translation to facial recognition; and the ARM Object Detection (OD) Processor, a second-generation design optimized for processing visual data and detecting people and objects. The OD processor is expected to be available to industry customers at the end of this month, while the ML processor design will be available sometime in the middle of the year.

The “industry customers” are chip makers like Renesas, NVIDIA, NXP and Qualcomm. We will see the first hardware samples in 1-2 years. Will we see the first terminals with these artificial-intelligence (AI) chips at Agritechnica 2023?
