Smart Machines Require Smart User Interfaces
A sugar beet harvester costs roughly €500,000. At 6-7 km/h, these machines dig out six beets in parallel, cut off the leaves, clean the beets in a rotating sieve and store up to 30 tons in the rear bunker. They can do U-turns automatically. They can do a crab walk to reduce soil compaction. They can tilt around their roll axis to compensate for slopes. In short, harvesters are technical wonders.
Some farmers join forces and buy a harvester together. They often hire drivers for the six-week harvest. Many farmers rent the harvesters, drivers included, from agencies. The drivers are badly paid. They work 12-16 hour shifts, by day or by night. When they return for the next harvest, they have forgotten most of what they learnt the previous season. Most drivers remain eternal beginners. Very few become experts.
When the beets reach the sugar factories, human evaluators estimate their sugar content. The farmers are paid according to this percentage. The more dirt adheres to the beets and the more leaves are left on them, the lower the estimated sugar content. If the beets are injured when the leaves are cut, they leak sugar. Growing sugar beets is a low-margin business.
The drivers play the crucial role in achieving reasonable margins. They look through the front window and judge whether the shovels dig deep enough and whether the knives cut the leaves just above the beets. They also watch the video from the rotating sieve to decide whether the beets are too dirty. Additionally, they must steer the harvester over the field while minimising diesel consumption. And these are only some of the drivers’ tasks.
If the drivers notice that something is off (e.g., dirty or injured beets), they must know which knobs to turn or which buttons to press. Of course, all the knobs and buttons are nicely realised in software on multi-touch displays these days. Their meaningful arrangement on a harvester schematic makes clear which four knobs lift the beets out of the ground and which five knobs clean them. However, the user interface doesn’t tell the tired, badly trained and underpaid drivers which knob to turn, by how much and in which direction.
Such user interfaces fail their main duty. They are dumb. They don’t make the daily work of their users easier. They don’t make users more productive. They don’t support users in doing a better job.
This is a shame! Machines are very expensive. A new sugar beet harvester costs half a million euros. Its user interface consists of two driver terminals (multi-touch display computers), a joystick, one or two rotary knobs and 20-30 hard buttons. Harvesting is not possible without the terminals. Yet each terminal costs less than €1,500.
Manufacturers charge €500,000 for a harvester and pay €3,000 for two terminals. Sure, the true value of the terminals comes from the software. Developing the terminal software – the user interface – for the first release took three developers two years. These three developers can also maintain and extend the terminal software. In comparison, developing and manufacturing the harvester takes 100 or more people.
Given how crucial the user interface is for the commercial success of a machine, you – the manufacturer of industrial machinery – should invest significantly more time and money in smart user interfaces. I have five guidelines for how you can get the highest return on your investment. Join me – the specialist in smart user interfaces – in my mission.
Guideline 1: Focus on What Customers Want to Buy
Traditionally, product development teams focus on what users want or need. This is futile, because users rarely know. Instead, these teams should get a lot better at predicting what customers will buy.
Hired harvester drivers – the users – don’t really care how many sugar beets are left in the ground, how dirty the beets are, whether the beets are injured or how much diesel the harvesters guzzle. The agencies put the drivers on tight schedules for harvesting the fields. If the drivers don’t keep to the schedules, they are paid less.
Unsurprisingly, farmers – the customers – make for the best drivers, as their incomes depend directly on the harvest. They make sure that the beets come out in the best condition and that the harvester consumes the least diesel. They tinker around with all the knobs and buttons until they are happy with the result. Expert drivers can achieve considerably higher yields.
What will customers buy? The answer should be obvious by now. They will buy machines with user interfaces that make every user more productive every day. Such smart user interfaces turn average users into expert users and enable expert users to find more effective and more efficient ways to interact with machines.
Our job is to understand why expert users interact with machines in certain ways in certain contexts. Harvester drivers have many different reasons. They observe that many beets are left in the ground because the soil is clay. So, they make the shovels dig deeper. They notice that many beets have head injuries because the knives cut too low. So, they lift the knives. They see that the beets are too dirty because it rained the night before. So, they make the sieves rotate faster.
Users observe their environment, deduce what to do and how to do it, and tell the machine which operations to perform in which order. By simplifying and automating this human-machine interaction, you turn dumb user interfaces into smart ones.
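To make this concrete, here is a minimal sketch of how such expert rules could be encoded in software so that the user interface can propose adjustments instead of leaving the judgement to a tired driver. It is purely illustrative: the sensor readings, thresholds and actuator commands are hypothetical, not an actual harvester API.

from dataclasses import dataclass

@dataclass
class Observation:
    missed_beet_share: float     # share of beets left in the ground, 0.0-1.0
    soil_type: str               # e.g. "clay", "sand"
    head_injury_share: float     # share of beets injured by the knives
    dirt_level: float            # 0.0 (clean) to 1.0 (very dirty)
    rained_last_night: bool

def propose_adjustments(obs: Observation) -> list[str]:
    """Translate observations into adjustments, the way an expert driver would."""
    commands = []
    if obs.missed_beet_share > 0.05 and obs.soil_type == "clay":
        commands.append("shovels: dig 2 cm deeper")
    if obs.head_injury_share > 0.10:
        commands.append("knives: lift 1 cm")
    if obs.dirt_level > 0.5 or obs.rained_last_night:
        commands.append("sieve: increase rotation speed by 10%")
    return commands

# Example: clay soil, many missed beets, dirty beets after a rainy night
print(propose_adjustments(Observation(0.08, "clay", 0.02, 0.7, True)))

In a real product, such rules would of course be derived from logged expert behaviour rather than hard-coded thresholds, but the principle stays the same: observations in, adjustments out.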
Guideline 2: Give Machines All Information Needed
A lot of information relevant to the harvest is available somewhere, but not on the harvester. The exact position of the beet rows is known from sowing. Satellite imagery and detailed maps show the borders of the fields. The tracks taken by the harvesters on the fields have been recorded for several years. The diesel consumption for these tracks is known as well. Smart user interfaces combine all this information to calculate the optimal route over a field while minimising diesel consumption – very much like the navigation system in your car.
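As an illustration only, the sketch below picks the cheapest route for a field from previously recorded tracks and their measured diesel consumption. The data structures and the selection criterion are assumptions; a real navigation component would also use the beet row positions and field borders mentioned above.

from dataclasses import dataclass

@dataclass
class RecordedTrack:
    field_id: str
    year: int
    waypoints: list[tuple[float, float]]  # GPS positions driven on the field
    diesel_litres: float                  # diesel consumed on this run

def best_route(field_id: str, tracks: list[RecordedTrack]) -> RecordedTrack:
    """Return the recorded track with the lowest diesel consumption for a field."""
    candidates = [t for t in tracks if t.field_id == field_id]
    if not candidates:
        raise ValueError(f"no recorded tracks for field {field_id}")
    return min(candidates, key=lambda t: t.diesel_litres)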
Ordinary user interfaces draw a blank, because they lack the necessary information. This lack has many causes. The seeder comes from a different manufacturer than the harvester. The seeder manufacturer stores the beet row positions in its own data silo and doesn’t share them with the harvester manufacturer. The harvester manufacturer regards the satellite imagery and detailed maps as too costly. It may not have started recording the tracks yet, or it shies away from alleged privacy concerns. So it becomes the driver’s job to compensate for the missing information.
Harvesters generate hundreds of sensor values per second. Chances are good that an inference component in the user interface can deduce valuable information from them and simplify the driver-machine interaction. The raw information is already available on the harvester; it only needs some processing (in software) to become useful.
The team developing the user interface may find that some sensors needed to automate interactions are missing. The sugar beet harvester, for example, lacks cameras trained on the beets just after they come out of the ground. It could also do with a reliable sensor for the fill level of the bunker. The user interface team can only improve the interaction if it has the power to demand the integration of new sensors into the machine.
You collect the data from many sources: cloud services, your machines and the machines of other manufacturers. You identify the required data and start collecting it long before you can leverage it. The more data you feed into statistical analysis, machine learning and deep learning, the more valuable and actionable the inferred information becomes. The user-machine interaction becomes simpler and more often automated. In short, well-connected user interfaces thrive, isolated ones fail.
Guideline 3: Build AI into User Interfaces
Deep learning, the AI technology enabling self-driving vehicles, is crucial to a much improved user-machine interaction and hence to the success of machines. Harvesters are no exception. Cameras looking at the beets as they come out of the ground stream their video to special image-recognition software. This software, a so-called deep neural network, determines how badly the beets are injured and how dirty they are. The user interface displays the injury and dirt levels, and tells the front implement how to adapt the shovels and knives.
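A rough sketch of this camera-to-implement loop could look as follows. The model file, the output format and the implement commands are assumptions for illustration, not the interfaces of a real harvester; ONNX Runtime is used here simply as one common way to run a trained network.

import numpy as np
import onnxruntime as ort

# Hypothetical network trained to rate injury and dirt of the beets in a frame
session = ort.InferenceSession("beet_quality.onnx")

def assess_frame(frame: np.ndarray) -> tuple[float, float]:
    """Run the neural network on one camera frame; return (injury, dirt) in 0..1."""
    inputs = {session.get_inputs()[0].name: frame[np.newaxis].astype(np.float32)}
    injury, dirt = session.run(None, inputs)[0][0]
    return float(injury), float(dirt)

def control_step(frame: np.ndarray, ui, implement) -> None:
    """Display the levels to the driver and adapt the front implement."""
    injury, dirt = assess_frame(frame)
    ui.show(injury_level=injury, dirt_level=dirt)   # hypothetical display call
    if injury > 0.3:
        implement.lift_knives(millimetres=10)       # cut higher above the beet heads
    if dirt > 0.5:
        implement.increase_sieve_speed(percent=10)  # clean the beets more aggressively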
Making image recognition work requires forward-looking planning, a lot of work and a powerful computer on the harvester. You need to shoot many hours of video at least one season before productising image recognition. Expert drivers and the evaluators from the sugar factories classify the injury and dirt levels in thousands of images. Specialists train a deep neural network with these labelled images. The trained networks run on computers with powerful graphics or neural processing units on the harvesters. The user interface typically runs on the same computer for simple and fast communication.
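The training side could, for example, be a straightforward fine-tuning job like the following sketch in PyTorch. The folder layout, class labels and hyperparameters are assumptions; the point is merely that labelled frames from last season’s videos are enough to train a first network.

import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Frames exported from last season's videos, sorted into one folder per
# injury/dirt class by expert drivers and factory evaluators (assumed layout).
train_data = datasets.ImageFolder(
    "beet_frames/train",
    transform=transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()]),
)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# The trained model is then exported (e.g. to ONNX) and deployed on the
# harvester computer with its graphics or neural processing unit.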
Deep learning can also be used to find the optimal route over the field, to detect people or objects in the way of the harvester, or to hear when the engine needs maintenance. Simpler methods like statistical analysis and machine learning can detect and harness relationships between sensor values and user interactions with the machine. These relationships look like this: if a certain combination of sensor values occurs, the user is highly likely to control the machine in a certain way.
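A minimal sketch of learning such a relationship, assuming a hypothetical log file of sensor values and the driver actions that followed them:

import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Hypothetical log: one row per moment, sensor values plus the driver's reaction
log = pd.read_csv("harvest_log.csv")
features = log[["soil_moisture", "driving_speed", "sieve_speed", "engine_load"]]
actions = log["driver_action"]   # e.g. "dig_deeper", "lift_knives", "none"

model = DecisionTreeClassifier(max_depth=4).fit(features, actions)

# On the harvester: suggest (or automatically apply) the likely action
# for the current combination of sensor values.
current = pd.DataFrame([{
    "soil_moisture": 0.42, "driving_speed": 6.5,
    "sieve_speed": 120, "engine_load": 0.8,
}])
print(model.predict(current)[0])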
All these methods simplify and automate user-machine interaction. Smart user interfaces exploit all these methods and especially deep learning, as it enables a leap forward in automation.
Guideline 4: Let Your Best People Develop the User Interface
A user interface (UI) is a lot more than flashy knobs, graphs, dialogs and windows. It is a complex computer system – called the UI system – communicating with users, machine sensors and actuators, cloud services and databases. The user interacts with the machine through touch displays, joysticks, rotary knobs, hard buttons, voice and gestures. The UI system exchanges information with other systems through cloud services. It performs image, voice or gesture recognition. It sifts through thousands of sensor values per second, filters and analyses the important ones, and presents them to the user in an actionable way.
The best UI systems act on all this data mostly automatically. Users shift into supervisory roles similar to pilots, who take over only in critical situations. The handover from automated to manual mode is critical and tricky.
UI systems are complex, as they cooperate with many other systems and with human beings, the most complex systems of all. Junior developers and interns cannot build the user interface, let alone the whole system. Even many senior developers are not up to this task. Neither the sales people, the marketing people, the UI designers, the UI developers nor the developers of the machine’s hardware and software should decide alone about the best human-machine interaction. They should decide together, guided by people with a holistic view of the system. System architects tend to be excellent guides.
Such experts are rare. So are experienced designers and developers. Typically, you do not have these people in-house, or not enough of them. Hence, you outsource UI development partially or even completely. Professional service companies have little interest in transferring their knowledge to your in-house teams. The less your in-house developers know, the longer you need these companies and the longer they can charge you.
When you work with outside people, work with technical experts who like to share their knowledge and who deeply understand the complex interactions of a UI system. As they make your developers a little better every day, they make themselves superfluous over time. They give you control over your own fate.
Guideline 5: Focus on Value, Not on Costs
Technical experts should make the big bets during product development, as they are the most likely to get them right. For example, they decide to install a high-quality, robust camera system and to deploy image-recognition software on the UI system to determine the injury and dirt levels of the sugar beets. The computer running the UI system must be powerful, with a dedicated neural processing unit. The development costs are significant.
Such bets have high costs written all over them. Many manufacturers shy away from these costs. They bargain hard to get 3% off the €3,000 for the run-of-the-mill terminal that will need an upgrade in five years at the latest. They hire cheap developers producing software that needs a complete rewrite every five years. Despite machines selling for six or seven figures, they only see risks and costs.
Tesla saw opportunity and value instead. They were the first to put a supercomputer in their cars. This supercomputer is ideally suited to run both semi-autonomous driving and the infotainment system. Tesla were also the first to update the car’s software automatically and regularly. This enabled them to improve their cars at breakneck speed.
Tesla’s bet paid off. They went from nearly bust to one of the most valuable companies in the world. They are years ahead of their competition. Where Tesla was driven by the chance to create high value, the other car makers were driven only by the fear of high costs. Nearly ten years too late, they are making much bigger investments than Tesla just to catch up.
A clever investment in the right software, hardware and people at the right time makes all the difference for the success of a machine. User interfaces are the best place for such investments, because they provide massive leverage. By simplifying and automating user interaction, they make users more productive and enable users to do new things.
People buy cars because of their infotainment systems – or they don’t. Manufacturers of industrial machinery are slowly acknowledging that user interfaces are crucial for their customers’ buying decisions. Hence, they spend millions of euros on AI technologies like deep learning and machine learning to make their user interfaces smarter. Customers will honour the considerably higher value of smart user interfaces by buying more machines at higher prices.