
Automated End-of-Line Testing of an Embedded Linux System: Application with GUI Using Python

Background

Recently, I was tasked with developing a software package (test-suite) for End-of-Line (EoL) testing of an embedded Linux system (ELS). The ELS hardware was made up of a single-board computer, a daughterboard, a wireless module, and other components, and had input ports for external sensors. The original thought was that a few Python and bash scripts and ssh'ing into the ELS would do the trick. Then I asked who the end user was going to be; the answer was "engineers on the manufacturing floor," and they did not mean software developers or engineers.


As someone who does a lot of his development on Linux and/or for [embedded] Linux, I feel sympathetic towards those who do not like Linux or are even scared of it. Linux is a strange world, perhaps because the Linux kernel is modeled after the personality of Linus Torvalds (the creator of Linux)! Anyway, to make the test-suite more user friendly, I decided to take some additional steps: add a graphical user interface (GUI) and do more than ssh'ing. I also came up with a modular architecture for the test-suite so it could potentially be reconfigured and reused (after minimal changes) for testing similar embedded systems.


In this article, I will provide a high-level overview of the test-suite architecture, features and graphical user interface. I will briefly explain how the test-suite can be configured and used for EoL testing of an embedded system. I will leave how functional tests are defined and carried out for another article.


What Is End of Line (EoL) Testing?

Considering their complexity, testing manufactured embedded systems is crucial. EoL testing ensures that all components in the device under test (DUT = embedded system), including ports, board interfaces and modules, function as intended. EoL testing is carried out in addition to other testing stages such as burn-in and the general testing of individual hardware components prior to their use in assembling new units. The reader is encouraged to refer to the literature, including the book chapter ("End-of-Line – EoL Testing") by Klippel (2011), for a more detailed industry definition.


Embedded Linux System: Device Under Test (DUT)

I cannot share the details of the embedded Linux system that was the actual target of the test-suite. Instead, I am going to tweak things as if the test-suite was developed for this embedded Linux system (Fig. 1; developed as part of a research project). The ELS/DUT is loaded with a computer vision middleware and used mainly in agriculture (plant high-throughput phenotyping and precision agriculture). The hardware core consists of a single-board computer, a microcontroller-based daughterboard, a touchscreen, imaging sensors, an ultrasonic range finder, a light sensor, a wireless module (latest model only) and other components. It also accepts inputs from peripherals such as an external microclimate unit over a serial port (SDI-12).



Figure 1. Embedded Linux system components

The targeted ELS was relatively simple and included a single-board computer, a daughterboard, a wireless module, and a few ports (audio, RJ-45, USB, RS-485). In addition to being self-powered by lithium-ion batteries, the ELS has an Ethernet port, which is used to provide both power (through a PoE injector) and a network connection to the unit. Fig. 2 illustrates how the ELS, injector, network switch and local computer are wired.



Figure 2. Wiring and powering the embedded system


Using Emulated Sensors

To test sensor ports, one might be tempted to use actual sensors as part of the test setup. However, sensor malfunctions, inaccuracies, etc. introduce new variables into the equation and complicate the testing process. In my opinion, wherever and whenever possible, emulated sensors should replace physical sensors. Sensor emulation can be carried out very cost-effectively using a variety of microcontroller boards, or even single-board computers if higher processing speed is needed.
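As an illustration, the protocol logic of such an emulator can be isolated as a plain function that maps incoming commands to canned replies. The command strings and responses below are made up for demonstration and are not real SDI-12 frames; on real hardware this logic would run on a microcontroller and read/write a serial line.

```python
# Sketch of an emulated sensor responder. The command/reply strings are
# illustrative placeholders, not real SDI-12 frames.

def emulated_sensor_response(command):
    """Return a canned, deterministic reply for a known command."""
    responses = {
        "0I!": "EMULATOR_SENSOR_001",   # hypothetical identification reply
        "0D0!": "0+23.5",               # hypothetical canned temperature reading
    }
    return responses.get(command, "")   # empty reply for unknown commands
```

Because the replies are deterministic, a failed port test points at the port or wiring rather than at a flaky physical sensor.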



Architecture of Test-Suite

Desired Features

The primary users of this test-suite were members of a manufacturing team who did not have any background in software engineering/development and had minimal familiarity with the Linux OS. They needed the test-suite to verify that assembled units and their components were fully functional before shipping. Taking this into consideration, I decided my test-suite should have the following features:


  1. Has a graphical user interface (GUI) and is interactive

  2. Accepts user input (initials, serial number, etc)

  3. Runs automated tests

  4. Is modular, which allows for quickly creating new tests or modifying existing ones

  5. Allows for sequencing tests

  6. Generates and displays easy-to-follow test results (failed, passed, aborted, etc)

  7. Formats and logs test results locally and/or in an online database

  8. Can be integrated into other solutions

  9. Can test multiple DUTs at the same time



Client-Server Architecture: JSON-RPC over HTTP

The initial draft (first version) of the test-suite took advantage of the APIs exposed by the ELS middleware. However, this posed some limitations in terms of tests and access to the hardware. To have full control, I decided to structure the test-suite into Client and Server software packages and make it independent of the middleware:


  1. Client: an application with a graphical user interface (GUI) that the user runs on their local computer.

  2. Server: a headless application running on the DUT.


In this architecture, the Client sends JSON-RPC requests (function calls) over HTTP to the Server, and the Server performs an action (i.e. a test) accordingly and provides a response. Each JSON-RPC request message contains a method (test name) and may or may not include parameters (params) for performing a specific test. For example, sending a "testAudio" request from the application results in playing a sound on the DUT. Similarly, a "setOsVolume" request (Fig. 3) sets the DUT OS volume to maximum ("value": 100).



Figure 3. setOsVolume JSON-RPC request

An advantage of this approach is that Postman flows can be used to verify the HTTP requests (with JSON-RPC bodies). The test-suite architecture is shown in Fig. 4.
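To sketch what such a request looks like on the wire, the hypothetical helper below builds a JSON-RPC 2.0 body like the setOsVolume example in Fig. 3. The endpoint URL and port in the comment are assumptions, not the suite's actual values.

```python
import json

def build_jsonrpc_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body (method + optional params)."""
    body = {"jsonrpc": "2.0", "method": method, "id": req_id}
    if params is not None:
        body["params"] = params
    return json.dumps(body)

# The Client would POST this body over HTTP to the Server on the DUT, e.g.
#   requests.post("http://<dut_ip>:<port>", data=body,
#                 headers={"Content-Type": "application/json"})
# (endpoint and port are assumptions).
body = build_jsonrpc_request("setOsVolume", {"value": 100})
```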



Figure 4. Architecture of the test-suite for EoL testing of an embedded Linux system

If the ELS OS monopolizes local resources, the middleware needs to be properly stopped, or the ELS needs to be flashed with a different OS to accommodate the tests. Another approach is to write a compatible OS image to a flash drive and boot the system from it. However, this might not be an option for an ELS coming off a manufacturing line, because the single-board computer is most likely flashed with an OS that lacks the required board support package.


Tools

To develop the test-suite, I relied on the following tools, standards and communication protocols (among others):


  1. Python and Bash

  2. JSON-RPC over HTTP (to invoke tests on DUT)

  3. SPI (spidev), UART, RS485, RS232, SLIP encapsulation, etc


I had prior experience developing GUIs with Qt and C++ (example), so I decided to go with the familiar PyQt (a Python binding for Qt) for GUI development. Regardless of the language, I consider the Qt framework a good choice, and that is my opinion after years of using MS Visual Studio for GUI development.


I developed and tested both the Client and Server packages on Linux (Ubuntu 22.04.2 LTS). I did my best to make the Client package OS agnostic. However, additional (major) work is needed to run the application on any other OS, such as Windows or another Linux version or distribution.



Software Packages

Client Software Package

The Client software package is structured into the following parts, and each part comprises at least one Python module:


  1. Run script

  2. GUI

  3. Test sequencer

  4. Test definitions

  5. Logger

  6. Configuration files


Run script

I added a bash script (run.sh) that starts the GUI after carrying out some system pre-checks to make sure all dependencies and required libraries are installed. If the script determines that any of the required packages are not installed, it prints a message with tips on how to install the missing packages. Installing them could also be automated, but I left it as a manual step.
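The kind of pre-check such a run script performs can be sketched in Python as follows; the package names in the example are illustrative, not the suite's actual dependency list.

```python
import importlib.util

def missing_packages(required):
    """Return the subset of required module names that cannot be imported."""
    return [name for name in required if importlib.util.find_spec(name) is None]

# Illustrative check; the real script would list the suite's dependencies
# (e.g. the PyQt bindings).
for name in missing_packages(["json", "definitely_not_installed_xyz"]):
    print("Missing package '%s'. Tip: try 'pip install %s'." % (name, name))
```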


Graphical user interface

The GUI is at the heart of the test-suite and contains the classes, functions, variables and constants that define the application's appearance and how it interacts with the user. The GUI runs in its own thread, and other major tasks such as DUT tests run in separate threads. For multi-threading, I relied on QThread, pyqtSignal, QMutex, etc.


Test sequencer

The test sequencer is the main module for sequencing EoL tests. I did my best to keep this module as simple as possible, in case another developer or an advanced user needed to move things around. The tests are categorized into hardware, function and manual groups for convenience.
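The sequencing idea can be sketched as follows: run the enabled tests in a fixed order and collect their results. The names and config structure here are illustrative, not the suite's actual API.

```python
# Minimal sketch of a test sequencer: iterate over (name, callable) pairs in
# order, skipping tests disabled in the configuration.

def run_sequence(tests, config):
    """Run each enabled test in order; return a {test_name: result} mapping."""
    results = {}
    for name, func in tests:
        if not config.get(name, {}).get("enabled", True):
            continue                    # skipped tests do not appear in results
        results[name] = func()
    return results

# Illustrative tests and configuration (rs485_com disabled).
tests = [
    ("audio_check", lambda: "PASS"),
    ("rs485_com", lambda: "FAIL"),
]
config = {"rs485_com": {"enabled": False}}
```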


Test definitions

This module contains all the functions and classes that define and implement the various EoL test groups. Based on the design of the ELS and the requirements of the project, I defined a variety of methods, called functional tests, to test the ELS hardware components (and some software aspects). I also came up with criteria (not easy) for making reasonable Pass/Fail decisions. The tests range from fully automatic to manual, with instructions provided by the application (Client) in the form of popup messages and console prints. The result of a test is a Pass/Fail classification. If an automatic test fails, the reason is provided and logged. If a manual or semi-automated test fails, the user is asked to provide an explanation, which is recorded as well.
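As a hypothetical example of the shape a test definition might take, here is a threshold-based Pass/Fail classification loosely modeled on the thermal_check test mentioned later; the thresholds, field names and message format are assumptions.

```python
# Sketch of one test definition: classify a measurement against configured
# thresholds and attach a short explanation on failure. Values are assumptions.

def thermal_check(measured_c, min_c=10.0, max_c=60.0):
    """Classify a body-temperature reading as PASS or FAIL."""
    if min_c <= measured_c <= max_c:
        return {"pass_fail": "PASS", "message": ""}
    return {
        "pass_fail": "FAIL",
        "message": f"temperature {measured_c} C outside [{min_c}, {max_c}] C",
    }
```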


Logger

The logger module contains the classes, methods and functions used to store test results in JSON format. It also prints error and info messages to the console.


Configuration files

Configuration files store parameters for configuring EoL tests. Initially, there is only a default configuration file. The user can modify the default config file and save it under a new name (any name except the one taken by the default config file).
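The save-under-a-new-name rule can be sketched as follows; the default file name and config contents are assumptions.

```python
import json
import os
import tempfile

DEFAULT_NAME = "default.json"   # assumed name of the default config file

def save_config(directory, name, config):
    """Save a user config under any name except the default's."""
    if name == DEFAULT_NAME:
        raise ValueError("the default config file name is reserved")
    path = os.path.join(directory, name)
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    return path

# Usage: save a tweaked copy of the defaults under a new name.
workdir = tempfile.mkdtemp()
path = save_config(workdir, "dut_1234567890.json",
                   {"rs485_com": {"enabled": False}})
```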


Server Software Package

The Server software package is structured into the following parts, and each part comprises at least one Python module:


  1. Run script

  2. JSON-RPC Server

  3. Manager

  4. Test definitions

  5. Libraries


Run script

Similar to the Client software package, I added a bash script (run.sh) that starts the Server after carrying out some system pre-checks to make sure all dependencies and required libraries are installed. If the script determines that any of the required packages are not installed, it prints a message with tips on how to install the missing packages. Ideally, the packages are installed only once and the modified OS image is flashed into the DUTs. The Server scripts, however, can easily be updated by the Client.


JSON-RPC Server

This module contains the functions, classes and methods required to implement a JSON-RPC server on the DUT.
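The core dispatch step of such a server can be sketched as follows: map the "method" field of an incoming JSON-RPC request to a registered handler and wrap the result in a JSON-RPC 2.0 response. The HTTP plumbing is omitted, and the handler registry is illustrative.

```python
# Sketch of JSON-RPC method dispatch (no HTTP layer shown).

def dispatch(registry, request):
    """Execute the requested method and build a JSON-RPC 2.0 response."""
    method = request.get("method")
    if method not in registry:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    result = registry[method](request.get("params", {}))
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

# Illustrative handler registry mirroring the setOsVolume example.
registry = {"setOsVolume": lambda params: "volume set to %d" % params["value"]}
resp = dispatch(registry, {"jsonrpc": "2.0", "id": 1,
                           "method": "setOsVolume", "params": {"value": 100}})
```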


Manager

Similar to the Client, I implemented multi-threading on the Server side. As the name implies, the manager module manages the data that is shared among various threads. It also creates some shared objects accessed and used by other modules.


Test definitions

This module contains all the functions and classes that define and implement various DUT test groups. It also prepares and sends a response to the Client (in JSON format) after a test is carried out.


Libraries

To be able to communicate with and test some of the hardware components (e.g. wireless module), I created and added a number of Python libraries (example). I formatted one of the libraries into a class that could be run as a QThread to work in the background.


Graphical User Interface (GUI)

My favorite way to design a GUI is with tabs, so I organized the GUI into two tabs, Console and Configurations, and added a menu bar. The GUI components are explained below.


Console Tab

The Console tab (Fig. 5) includes buttons (on the left side) and two console windows for output originating from the Client and the Server, respectively. The application prints information, error messages, instructions, etc. to the console. The application also uses popup messages and dialog boxes for interactions with the user.



Figure 5. Graphical user interface Console tab

The buttons in the Console tab have the following functionalities:


  1. Start: starts a configured test sequence.

  2. Stop: stops a running test sequence. Clicking this button does not terminate running threads; the user needs to wait until the ongoing process ends.

  3. Find MAC: finds the MAC address of the connected (wired) DUT.

  4. Find IP: finds the IP addresses of the localhost and the connected (wired) DUT(s).

  5. Find Device: finds the port name of any connected adapter (e.g. RS232-to-USB) or TTY device that exposes the U-Boot console.



Configurations Tab

The Configurations tab (Fig. 6) includes most of the options and inputs required for configuring DUT functional tests and running the included utilities. As can be seen, the configurations are categorized into different group boxes:


  1. Config File

  2. Group Configs

  3. Test Configurations

  4. Parameters: Serial ports

  5. Parameters: Thermal Thresholds

  6. User Inputs



Figure 6. Graphical user interface Configurations tab

As mentioned before, the application comes with a default config file (JSON format; Fig. 7). The user can change the default configurations and then save them into a new config file (by clicking the ‘Save’ button) under a new name. The next time the application starts, it automatically loads the configurations from the file created by the user. There can be multiple config files, each used with a different DUT. The ‘Default’ button loads the default configurations.



Figure 7. Configuration file in JSON format

EoL tests are categorized into the following groups for convenience and there is an option to Aggregate Test Results at the end of a test sequence:


  1. Simulation tests

  2. Hardware tests

  3. Manual tests


An entire test group can be disabled/enabled by unchecking/checking the related box. If the ‘Aggregate Test Results’ box is checked, the application (Client) aggregates all the test results into a single JSON file. If there are multiple test result files for any given test, they are all included in the aggregated file. ‘Simulation tests’ only work when emulated sensors are connected to the DUT.


In the Test Configurations section, the individual tests under the test groups described above can be enabled/disabled. If the ‘Logged’ box is checked, the application logs the test results for that specific test; otherwise it only runs the test (without logging).


The Parameters: Serial Ports section holds serial port settings for the rs485_com test (using an RS485-to-USB adapter) and the U-Boot console. The latter is automated and does not need to be configured by the user.


The Parameters: Thermal Thresholds section holds settings for the thermal_check manual test, which checks the body temperature of the ELS when running at full power. The minimum and maximum temperatures need to be set only once.


The application requires the user to enter the following information before starting DUT functional testing:


  1. Initials: user initials

  2. Co.: company name or location

  3. Version: DUT version

  4. SN: DUT serial number

  5. MAC: DUT MAC address

  6. IP (Localhost): local computer IP address

  7. IP (DUT): connected DUT IP address


If any of the fields are left empty, the application prompts the user to enter the missing information. Some of the fields are filled automatically.


Menubar

The menubar includes the File, Actions, Tests, Tools, and Help menus. The Tools menu includes several utilities that allow the user to prepare the DUT for functional testing. The items in the Tools menu are as follows:


  1. Create Image Chunks: generates OS image chunks from an OS image that is too large to be flashed directly into the DUT.

  2. Flash OS Image: flashes the OS image (chunks) containing the Server into the DUT.

  3. Flash Daughterboard: flashes the DUT daughterboard with new firmware.

  4. Update Wireless Module Firmware: the firmware is provided by the manufacturer.

  5. Update Server: transfers the Server files from a host computer to the DUT.



Functional Testing

Required Steps

To run functional tests on DUTs that come off the manufacturing line, some preparations are needed. For example, manufactured DUTs need to be flashed with a modified version of the stock OS image that has all the necessary packages installed. In my case, I needed to enable spidev, so I created a device tree overlay and applied it in U-Boot. In addition, the daughterboard and some of the modules might need software/firmware updates. The steps required to prepare DUTs for functional testing with the test-suite are as follows:


  1. Create a configuration file for connected DUT

  2. Create OS image chunks

  3. Flash OS Image prepared specifically for EoL testing

  4. Flash daughterboard

  5. Update wireless module firmware

  6. Transfer Server software package to DUT

  7. Determine MAC address for connected DUT and update the configuration file



Test Sequencing

The tests are currently sequenced as follows and are executed one after another:


  1. test_1

  2. test_2

  3. test_3

  4. ...


To exclude any of the above tests, changing the configuration in the Configurations tab suffices. To start a functional test sequence, the following steps are taken:


  1. In the Configurations tab, functional tests are configured and saved.

  2. In the Console tab, the Start button is clicked to start the test sequence. Prompts, info and console messages will guide the user through the rest of the process.

  3. A popup message at the end will provide a summary of the test results.


The test sequence can be stopped at any time by clicking the Stop button in the Console tab. The application might need to finish the active thread(s)/test before aborting the test sequence. No test summary is generated should one decide to abort the test sequence.


Run Individual Functional Tests

As an alternative to executing functional tests in a sequence automatically, one may run individual tests as needed. The following steps are taken to run an individual test:


  1. Prepare and connect the appropriate hardware (e.g. a USB-to-RS232 adapter) for the test.

  2. Find the test name in the list of tests under the Tests menu and click it.

  3. Wait for the prompts and messages that will guide you through the process.



Accessing Test Results and Logs

Location of Test Results

By default, the test results and process log files are stored locally and can be found under ~/test-results (Fig. 8). The directory is structured in the following format:


~/test-results/<date>/<serial_number>/<test_name>/<UUID>.json
~/test-results/<date>/<serial_number>/<logs_client>/<UUID>.log
~/test-results/<date>/<serial_number>/<logs_server>/<UUID>.log

Figure 8. Test results and logs directories

Date is formatted as yyyy-mm-dd (year, month, day). The test_name refers to the actual hardware, software or function test (e.g. IMU_check). The logs_client and logs_server directories contain process log files for the Client (local computer) and Server (DUT) sides, respectively. Each JSON file has a unique name (UUID).
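A sketch of how such a per-test result path might be assembled; the function name and root directory are illustrative.

```python
import datetime
import os
import uuid

def result_path(root, serial_number, test_name):
    """Build a per-test result path following the layout above."""
    date = datetime.date.today().strftime("%Y-%m-%d")   # yyyy-mm-dd
    return os.path.join(root, date, serial_number, test_name,
                        "%s.json" % uuid.uuid4())

path = result_path("test-results", "1234567890", "IMU_check")
```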


Contents of Logged data (JSON files)

JSON data files containing the test results are structured as shown (Fig. 9) and explained below. The JSON data files are designed to facilitate automatic extraction of information, data aggregation and uploading to an online database (if needed).



Figure 9. Snapshot of JSON data file structure (Python code)

general

This section contains UUID, test start (start_time) and end (end_time) times, user initials (user_initials), company/location name (company), DUT version (dut_version), DUT serial number (dut_serial_number) and DUT MAC address (dut_mac_address).


test_results

This section holds a variety of information on a given functional test, including its name, test result (pass_fail), and error message (if any). Should a test fail, the message field provides a short explanation (automatic or user input) of what went wrong, which can be used later to identify the root cause of the issue. The "name" field matches the name of the directory where the test result JSON file for that specific test is stored.
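Putting the two sections together, a result record might look like the following; the field names follow the description above, and all values are made up (the actual structure is shown in Fig. 9).

```python
import json
import uuid

# Illustrative record mirroring the general and test_results sections.
record = {
    "general": {
        "uuid": str(uuid.uuid4()),
        "start_time": "2021-07-11 09:15:02",
        "end_time": "2021-07-11 09:15:40",
        "user_initials": "AB",
        "company": "ExampleCo",
        "dut_version": "1.0",
        "dut_serial_number": "1234567890",
        "dut_mac_address": "00:11:22:33:44:55",
    },
    "test_results": {
        "name": "IMU_check",
        "pass_fail": "FAIL",
        "message": "no response from IMU over SPI",
    },
}
print(json.dumps(record, indent=2))
```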


Aggregation of Test Results

If the user checks the ‘Aggregate Test Results’ box in the Configurations tab or clicks ‘Aggregate Test Results’ in the Tests menu, the application automatically reads all the test results for the specified date and aggregates them into a single file (example: ~/test-results/2021-07-11/1234567890/aggregated_results).


The user can pick a date using the date picker in the menubar. If there is more than one test result file for a given test, the application includes them all in the aggregated test results. The structure of an aggregated file (Fig. 11) is, in terms of fields, very similar to that of an individual test result file. The general field does not include start and end times, because these are already included in the results for the individual tests.
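The aggregation step can be sketched as follows. The directory layout follows Fig. 8; the file names in the demo are illustrative (real result files are named by UUID).

```python
import glob
import json
import os
import tempfile

def aggregate_results(root, date, serial_number):
    """Collect every per-test result JSON under <root>/<date>/<serial_number>/."""
    pattern = os.path.join(root, date, serial_number, "*", "*.json")
    aggregated = []
    for path in sorted(glob.glob(pattern)):   # log dirs hold .log files, so skipped
        with open(path) as f:
            aggregated.append(json.load(f))
    return aggregated

# Demo with a throwaway directory tree holding two result files for one DUT.
root = tempfile.mkdtemp()
for test_name in ("audio_check", "IMU_check"):
    d = os.path.join(root, "2021-07-11", "1234567890", test_name)
    os.makedirs(d)
    with open(os.path.join(d, "result.json"), "w") as f:
        json.dump({"test_results": {"name": test_name}}, f)

results = aggregate_results(root, "2021-07-11", "1234567890")
```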


Process Log Files

The application generates and stores separate log files for all activities happening on the Server and Client sides. These directories are named logs_client and logs_server and can be found in the same location as the test result directories (~/test-results/<date>/<serial_number>/). Log files are named using a UUID and have .log as their extension. Each log file includes all test steps, errors, etc. to help with debugging should the process be unsuccessful.



References

Klippel, W. (2011). End-of-Line – EoL Testing. In W. Grzechca (Ed.), Assembly Line – Theory and Practice. IntechOpen. DOI: 10.5772/21037 (URL)
