Which system-on-chip is best suited for your Qt embedded system? Should you build a custom Linux system with Yocto, use a container OS or a desktop Linux? Should you use Qt Commercial or Qt LGPLv3? How will the system be updated? What are the operating conditions of the system? How does the system communicate with sensors, ECUs and the cloud?
This is only a small selection of the 50+ questions in this post. You had best tackle these questions early in the project, because fixing wrong decisions becomes harder the longer a project runs. Getting the basic architecture decisions right at the beginning saves a lot of time and money later. It may even make or break the project.
- Harvester cabins are commonly stuffed with three, four or more terminals. Each terminal – a display with an integrated computer – easily costs 1500-2000 Euros. The harvester OEM can replace each terminal with a “dumb” display (500-700 Euros each) and have a powerful computer like the NVIDIA Jetson or the NXP i.MX8 (500-800 Euros) drive the displays. They save good money on the hardware. Each terminal becomes an application on the central computer. They develop only a single system instead of four. But: the system would need different connectors, if not different cables, to handle the high data rates for transferring the display contents.
- An e-bike startup built the bike computer with Qt Commercial, Debian for ARM and a cheap Allwinner SoC that didn’t have a CAN port. For the system bring-up, they had to hire a Debian specialist who worked on multiple projects. Allwinner didn’t help with getting Debian running on their SoC, as their focus is Android. The startup also had to patch the CAN bus through a UART – a hardware modification. The system bring-up was delayed by half a year. This delay was one of several reasons why the startup went bust.
- One home appliance maker chose Qt Commercial and paid millions in license fees. They were too afraid to update Qt on their appliances. Another home appliance maker chose Qt LGPLv3 and saved millions in license fees. They found a way to let users update Qt on their appliances without the risk of being sued.
- A team decided to use QML and Qt for Python (PySide2) on an embedded Linux system, because the developers were familiar with Python but not with C++. With the release less than a month away, they found out that PySide2 didn’t work on embedded Linux for Qt 5.x. As the system was small, rewriting the Python code in C++ didn’t delay the release too much.
In the rest of this post, I list lots of questions that you should ask when starting the development of a Qt embedded system. The answer to these questions may vary from project to project. The two home appliance makers choosing different Qt licenses is a good example. Each of them had their good reasons.
The answers may also change over time. In 2017, replacing the four terminals in the harvester with four dumb displays plus a computer would have required hardware modifications for the high-speed connectors. In 2021 or 2022, you’ll probably find standard hardware with these high-speed connectors.
I divided the questions into seven areas: Applications and Window Manager, Operating Systems, System-on-Chips, Operating Conditions, Cloud, Controller and Sensors. I plan to write one post for each area and four case studies from my own experience. The case studies will cover an in-car infotainment system, a driver terminal of a harvester, an operator terminal of an industrial machine and a medical device.
Trying to answer the questions will increase your chances of succeeding with your current or next Qt embedded system. Here is the list of 50+ questions. Feel free to send me your questions.
Architecture of Qt Embedded Systems
Typical examples of Qt embedded systems are
- the operator terminals of agricultural, construction and industrial machines,
- the display computers of e-bikes, TVs, printers, home appliances and medical devices,
- in-car infotainment systems and
- in-flight entertainment systems.
They share this high-level architecture.
Applications and Window Manager
A Qt embedded system runs one or more applications. The applications have a human-machine interface (HMI). The window manager composes the application HMIs into one system HMI.
Many systems show only one application. They don’t need a window manager. This may be true for the first generation of your system, but not for the second or third. You want to mirror the screen of your machine terminal via VNC to the office PC of a support person sitting halfway around the world. With a window manager, it’s not much more than adding the VNC package to the Yocto build. Without a window manager, a few days of programming are involved.
Another reason for a multi-application system is stability. Suppose the single application of a machine terminal crashes because of a bug in the settings part of the application. If the main functionality and the settings functionality were in two separate applications (processes), the main application wouldn’t be affected by a crash of the settings application. Machine operation could continue nearly unaffected.
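The process split can be sketched with a minimal supervisor: the main application spawns the settings application as a child process and restarts it when it crashes. This is a plain-Python sketch with a simulated crashing child; a real embedded Linux system would typically delegate this job to systemd or a similar init system.

```python
import subprocess
import sys

# Hypothetical command for the settings application; here we simulate
# a crashing child process that exits with a non-zero status.
SETTINGS_CMD = [sys.executable, "-c", "import sys; sys.exit(1)"]

def supervise(cmd, max_restarts=3):
    """Run the child process and restart it on crash. The caller (the
    'main' application) is never brought down by the child's failures."""
    restarts = 0
    while restarts <= max_restarts:
        result = subprocess.run(cmd)
        if result.returncode == 0:
            return "exited cleanly"
        restarts += 1
    return f"gave up after {max_restarts} restarts"

if __name__ == "__main__":
    print(supervise(SETTINGS_CMD))  # gave up after 3 restarts
```

The main application stays responsive throughout; only the restart budget and the crash policy need tuning for a real terminal.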
In-car infotainment systems are the prime examples of multi-application systems. They come with some standard applications like radio, media player, navigation and settings. They also allow third-party apps or apps mirrored from smartphones (Apple CarPlay, Android Auto, MirrorLink). A special-purpose window manager is a must for these systems. General-purpose window managers like X11 are rarely the first choice, because they come with a huge overhead and are hard to control.
As an architect, you must answer these questions:
- How many applications run at the same time?
- Which window manager – eglfs, Wayland, X11 – is best suited for your system?
- What are the graphical requirements of the system? 3D? Frequently changing dynamic content? Animations? Overlays (HMI on top of video)? Compositor for multiple windows?
- Which input methods – touch, hard keys, joystick, on-screen keyboard, speech, etc. – does the system require?
- How many screen resolutions and formats does the system have to support?
- Should you use QML or QWidgets for the GUI parts of the applications?
- Should you use C++ or Python for the non-GUI parts of the applications?
- How much RAM and flash memory do the applications need?
- Which Qt license (Commercial, LGPLv3, GPLv3) should you choose?
Operating Systems

Qt embedded systems run on Linux, on real-time operating systems (RTOSs) like QNX, VxWorks and FreeRTOS, and even on bare metal. By far the most common operating system (OS) is Linux custom-built with Yocto. Desktop Windows and desktop Linux are widely used in operator terminals for production lines and industrial machinery. This variety raises a couple of questions.
- Which OS should you choose: bare metal, an RTOS, Linux or Windows?
- Should you build Linux yourself with Yocto or Buildroot, buy a “commercial” Linux from Siemens (Mentor) or Wind River, or use a desktop Linux (e.g., Ubuntu)?
- How could a container OS like Torizon OS or Balena OS speed up development? What are the downsides?
- On which other OSs do the embedded Qt applications have to run (e.g., mobile, desktop)?
- How fast does the system have to boot?
- How are system updates performed? From USB drive? Over-the-Air (OTA)? Over Ethernet or CAN?
- What are the real-time requirements?
- What are the safety requirements? Which safety standards does the system have to meet?
- What are the security requirements?
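Several of the update questions above lead to an A/B scheme: two root filesystem slots, with the bootloader switching to the freshly written slot and falling back to the old one if the new slot never boots successfully. The slot-selection logic can be sketched in a few lines (the state variables are hypothetical; in practice this logic lives in U-Boot environment scripts or in OTA tools like RAUC):

```python
def select_slot(state):
    """Pick the boot slot for an A/B update scheme.
    state: {'active': 'A'|'B', 'pending': bool, 'boot_attempts': int}
    """
    MAX_ATTEMPTS = 3
    other = "B" if state["active"] == "A" else "A"
    if state["pending"]:
        if state["boot_attempts"] < MAX_ATTEMPTS:
            state["boot_attempts"] += 1
            return other          # try the freshly updated slot
        state["pending"] = False  # new slot never came up: roll back
    return state["active"]

state = {"active": "A", "pending": True, "boot_attempts": 0}
print(select_slot(state))  # B - first attempt to boot the update
```

After three failed attempts the device boots the old, known-good slot again, so a broken update never bricks the system.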
System-on-Chips

Toradex’s Colibri family of Computer-on-Modules (CoMs) has eight members. The CoMs have one to four ARM Cortex-A5, Cortex-A7, Cortex-A9 or Cortex-A35 microprocessor cores. Some have an additional ARM Cortex-M4 microcontroller. Half of the CoMs have a 3D GPU for OpenGL graphics acceleration. The CoMs differ in RAM, flash and peripherals.
These eight CoMs are only a tiny selection of available system-on-chips (SoCs) for Qt embedded systems. You can choose from microcontroller and microprocessor SoCs manufactured by NXP, Texas Instruments, NVIDIA, Qualcomm, Renesas, STM, Allwinner, Mediatek and many other companies. And you shouldn’t forget the Raspberry Pi SoCs.
The answers to questions in the other sections narrow down the SoC candidates. The following questions help you to reduce the candidate list even further.
- How compute-intensive or graphic-intensive are the jobs performed by the applications?
- Will the system perform inferencing on deep neural networks (e.g., for image or speech recognition)?
- How much do the SoCs cost?
- Is the system powered from battery or mains? What power modes does the SoC provide?
- Which safety-critical jobs should you assign to the microcontroller of a heterogeneous SoC?
- What is the hardware support for the execution of safety-critical and security-critical jobs (e.g., TrustZone, secure boot, secure enclave)?
- How long will the SoC be supported by the manufacturer?
- How much room does the SoC have for additional features over the next 5, 10 or even 15 years?
- Is the SoC a solid basis for a product family?
Operating Conditions

Direct sunlight can wash out the contents of a display in a car, lorry, tractor or construction machine. Using a high-luminosity display or shielding the display from sunlight helps. A wide viewing angle and few reflections are other important attributes of good displays.
Agricultural and construction machines with 1000-horsepower engines produce noticeable vibrations. Standard RJ45 (Ethernet), DSUB-9 (CAN) and USB plugs would fall out of their sockets pretty soon.
Consumer devices work properly in a temperature range from 0°C to 40°C. A range from 0°C to 70°C is often OK for indoor machinery. Automotive-grade Qt embedded systems must work in temperatures from -40°C to 85°C.
Scales for weighing meat or cheese must withstand high-pressure jets of water with aggressive cleaning substances. Bike computers must resist dust, dirt and water. When you make devices water-tight and dust-tight, you must take care that the SoC doesn’t overheat. The implication may be that you must use a less powerful SoC.
- What are the lighting conditions?
- In which temperature range does the system operate?
- How high is the exposure of the system to water and dust?
- What are the thermal implications of protecting the system against water and dust?
- How strongly does the system vibrate?
- Do the operators wear gloves?
External Collaborators of Qt Embedded Systems
Qt embedded systems rarely work in isolation these days. They are connected to the cloud for remote support, monitoring and control.
Qt embedded systems receive inputs from image, LIDAR, air-quality, traffic and numerous other sensors. Systems with very powerful SoCs perform image recognition, speech recognition or sensor data evaluation on board. Less powerful systems offload these tasks to compute servers in the cloud.
Controllers regulate the heat of an oven, the rotational speed of a turbine, the speed of a car or the volume of a TV. They send back the current temperature, speed or volume to the system. The communication happens over CAN or Ethernet networks, serial connections like RS-232 and RS-485, Zigbee, LoRa and many other ways.
Cloud

The infotainment systems in cars and the operator terminals in tractors, harvesters and cranes often double up as IoT gateways. They collect data from multiple sensors and from the machine, filter the relevant data and send them to the cloud. Cloud applications log the data for later evaluation, detect problems early, diagnose the problems or send alerts to technicians.
A VNC server lets systems mirror their screen contents to computers anywhere in the world. Support technicians see on their PC monitors exactly what harvester drivers see on their terminals. Remote support becomes so much easier than over the phone.
Instead of sending the screen contents, the Qt embedded system could send the data over the Internet to office PCs, phones and tablets – for technical support, for fleet management, for accounting, for remote supervision, for designing wooden or metal profiles and for many other purposes.
You can run the Qt application from the embedded system on the remote computers with few changes and add some more features. Qt makes this easy, as it is cross-platform. Although this is the most time-efficient option, many companies re-implement the embedded Qt application with a web framework like Angular, React or Electron.
Web frameworks save you from any deployment hassles. Compiling the Qt application into WebAssembly may give you the best of both worlds on PCs and possibly on ever more powerful phones and tablets: no re-implementation and no deployment overhead.
- Which protocol (e.g., MQTT, CoAP, OPC UA) does the system use to communicate with the cloud?
- Does the system use a wired or wireless Internet connection (e.g., 3G, LTE, LoRa, NB-IoT, LAN)?
- How does the system recover from periods without (wireless) Internet connection?
- How does the system ensure end-to-end encryption for the communication?
- Which OTA-update solution does the system use (e.g., Mender, RAUC, SWUpdate)?
- How can the system facilitate remote support, remote diagnostics and remote supervision (e.g., digital twin, VNC, TeamViewer)?
- Which computational tasks does the system offload to the cloud (e.g., image and speech recognition, statistical evaluations)?
- How can you best reuse the Qt applications from the embedded system on other platforms like PCs, phones and tablets?
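One recurring answer to the question of surviving connectivity gaps is a store-and-forward queue: the system buffers outgoing readings locally and flushes them in order once the uplink returns. A minimal sketch, assuming a callable transport that raises `ConnectionError` while offline (a real system would publish over MQTT or a similar protocol):

```python
from collections import deque

class StoreAndForward:
    """Buffer outgoing messages while offline; flush them in order
    once the uplink is available again."""
    def __init__(self, send, max_buffered=1000):
        self.send = send                          # callable; raises on failure
        self.buffer = deque(maxlen=max_buffered)  # drops oldest when full

    def publish(self, message):
        self.buffer.append(message)
        self.flush()

    def flush(self):
        while self.buffer:
            try:
                self.send(self.buffer[0])
            except ConnectionError:
                return False       # still offline; keep messages buffered
            self.buffer.popleft()  # sent successfully; discard
        return True

# Simulated flaky uplink: offline at first, then back online.
sent, online = [], [False]
def uplink(msg):
    if not online[0]:
        raise ConnectionError
    sent.append(msg)

q = StoreAndForward(uplink)
q.publish("t=21.5")   # buffered, uplink is down
online[0] = True
q.publish("t=21.7")   # flushes both messages in order
print(sent)           # ['t=21.5', 't=21.7']
```

The bounded deque is a deliberate choice: on a flash-constrained device you decide up front how much history may pile up during an outage.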
Controllers

The controller in vehicles comprises several electronic control units (ECUs). There are ECUs for the steering, engine, brakes, cutting unit, air conditioning, joystick, camera system, telematics unit, etc. The driver terminal communicates with the ECUs over one or more CAN buses. The CAN messages are encoded according to the J1939 standard so that ECUs from different vendors understand each other.
Manufacturers of cars, lorries and agricultural, construction and forest machinery tend to maximise the number of ECUs. 20-30 ECUs in a harvester and 60-70 ECUs in a car are common. Firmware updates of the ECUs and communication between the ECUs become a nightmare.
Some manufacturers – with Tesla being the most prominent example – merge most of the ECUs and the infotainment system in one super ECU. Semi-autonomous driving and harvesting require inputs from many ECUs. The super ECU makes the communication between the “soft” ECUs faster and easier. The same is true for the communication between the teams developing the soft ECUs.
The border between Qt embedded systems and controllers becomes blurry for consumer devices. For a TV, the system and the controller (DVB-C receiver) are integrated in the same device, maybe even on the same board. For streaming boxes, the border disappears, unless you regard the video decoding circuitry on the SoC as the controller.
Wired communication between system and controller happens over Ethernet, CAN, Modbus, RS-485 and many other physical links. Wireless communication links include Wi-Fi, Bluetooth, LoRa and Zigbee.
You find an even greater variety of protocols when you move up to the application layer. For example, J1939 and CANopen are two standards that define the application-layer communication over CAN bus. Many companies define their own proprietary protocols over CAN bus.
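For J1939, much of the decoding starts with the 29-bit CAN identifier itself, which encodes the message priority, the Parameter Group Number (PGN) and the source address. A sketch of the field extraction, following the SAE J1939-21 layout (the example identifier is illustrative):

```python
def decode_j1939_id(can_id):
    """Split a 29-bit J1939 CAN identifier into its fields
    (layout per SAE J1939-21)."""
    priority = (can_id >> 26) & 0x7
    edp      = (can_id >> 25) & 0x1   # extended data page
    dp       = (can_id >> 24) & 0x1   # data page
    pf       = (can_id >> 16) & 0xFF  # PDU format
    ps       = (can_id >> 8) & 0xFF   # PDU specific (dest. address or group ext.)
    sa       = can_id & 0xFF          # source address
    # PDU1 (pf < 240) is destination-specific: PS is an address,
    # not part of the PGN.
    pgn = (edp << 17) | (dp << 16) | (pf << 8) | (ps if pf >= 240 else 0)
    return {"priority": priority, "pgn": pgn, "source": sa}

# A broadcast message (PDU2) from source address 0x00.
print(decode_j1939_id(0x18FEF100))
# {'priority': 6, 'pgn': 65265, 'source': 0}
```

Knowing the PGN, the terminal can look up the parameter group's payload layout in J1939-71 or a vendor specification and decode the data bytes.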
- How does the system communicate with the controller on the physical and application layer?
- Which standard protocols (including implementations) are available for the application layer?
- Do you have to define a proprietary protocol or can you extend a standard protocol?
- How can you generate the functionality for decoding and encoding messages automatically from the specification?
- Do you have to implement a communication framework that decodes messages into quantity objects, stores the quantity objects in digital twins of the ECUs, and encodes the quantity objects into messages? Or is one available off the shelf?
- How can you reuse the communication framework for a telematics unit, for an IoT gateway or for the embedded Qt application running on a PC, phone or tablet?
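Such a communication framework can be sketched as a decode table keyed by message ID that turns raw payloads into quantity objects and stores them in per-ECU digital twins. All names and the message layout below are hypothetical; in practice the table would be generated from the protocol specification (e.g., a DBC file):

```python
import struct

# Hypothetical message specification: CAN ID -> (ECU, quantity,
# struct format, scale, offset). In a real project this table is
# generated from the protocol specification.
SPEC = {
    0x101: ("engine", "coolant_temp_c", "<H", 0.1, -40.0),
    0x102: ("engine", "rpm",            "<H", 0.5,   0.0),
}

class DigitalTwin:
    """Holds the latest decoded quantities of one ECU."""
    def __init__(self):
        self.quantities = {}

def decode(twins, can_id, payload):
    """Decode a raw message into a physical quantity and store it in
    the matching ECU's digital twin."""
    ecu, name, fmt, scale, offset = SPEC[can_id]
    (raw,) = struct.unpack_from(fmt, payload)
    value = raw * scale + offset
    twins.setdefault(ecu, DigitalTwin()).quantities[name] = value
    return value

twins = {}
decode(twins, 0x101, struct.pack("<H", 1250))  # 1250 * 0.1 - 40 = 85.0
print(twins["engine"].quantities)              # {'coolant_temp_c': 85.0}
```

Because the framework is driven entirely by the specification table, the same code can serve a telematics unit, an IoT gateway or the Qt application running on a PC.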
Sensors

Harvesters easily have half a dozen video cameras, also known as image sensors. One camera is trained on the cutting unit, one or two cameras show how the crop flows through the machine and three or four cameras are combined into a surround view of the harvester (e.g., when driving backwards).
Image recognition – whether a plant is a weed or whether grains are cleaned properly – is slowly finding its way into harvesters. GPS-guided autonomous driving is fairly well established. If the SoC of the Qt embedded system is powerful enough (including hardware acceleration for inferencing on deep neural networks), image recognition or GPS guidance can be integrated in the system. Otherwise, they come in their own hardware boxes (ECUs) and send their results via the controller to the system.
Sensors can measure nearly everything including temperature, humidity, gas concentration, air quality, smoke, motion, distance, acceleration, pressure, sound and flow. They send their values to the system over wireless links like LoRa, Zigbee, Wi-Fi and Bluetooth or over wired links like Ethernet, RS-485 and CAN. The system may act as an IoT gateway for the sensor values.
The questions for sensors are similar to the questions for controllers, because you can implement the communication from sensors to the system pretty much in the same way as the communication from controllers to the system. The other direction from system to sensors may be missing completely or may be very simple.
- Do you evaluate the sensor data on the Qt embedded system (e.g., speech recognition) or on a separate system (e.g., a specialised ECU) or even in the cloud?
- Which physical-layer and application-layer protocol do the sensors use to send their values to the system?
- Which standard protocols (including implementations) are available for the application layer?
- Do you have to define a proprietary protocol or can you extend a standard protocol?
- How can you generate the functionality for decoding messages automatically from the specification?
- Do you have to implement a communication framework for decoding messages into quantity objects? Or is one available off the shelf?