Florian Pester, Jana Förster · 3 min read

Fully Automated On-Hardware Testing

Shift-left testing is an approach to address the harms of late testing, such as context switches for developers. Cyberus follows the 'test early and often' strategy, and fully automated tests are a crucial part of developer workflows. Testing low-level code on all supported hardware requires us to overcome the challenge of automating commodity hardware. Our flexible infrastructure supports functional tests as well as long-term performance monitoring.

Test automation

The “shift-left” approach to testing is becoming more and more popular. Early feedback allows developers to fix problems while their context is still fresh in their minds. Quality Assurance teams can focus on valuable tasks, such as designing innovative and complex test cases and efficient improvements to the testing strategy.

Cyberus’ low-level engineers rely on on-hardware testing because we support a multitude of different machine types and each of them has its own quirks. The configuration is quite dynamic, as we often need to support new hardware generations when they enter the market. Engineers use our test automation frameworks to trigger tests on all machines with a single click. The system guarantees high throughput because different machines execute their tasks in parallel.
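To give a feel for the "single click, all machines" workflow, here is a minimal sketch of parallel dispatch. The machine names and the `run_suite` body are hypothetical placeholders; in the real system each run would involve power-cycling the device, booting a test image, and collecting results.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical machine pool; real lab inventories and the transport used
# to reach each device are specific to the test infrastructure.
MACHINES = ["laptop-gen12", "laptop-gen13", "nuc-gen11"]

def run_suite(machine: str) -> dict:
    """Placeholder for running one test suite on one physical machine."""
    return {"machine": machine, "passed": True}

def run_on_all(machines):
    # Each machine executes its suite in parallel, so total wall time is
    # bounded by the slowest machine rather than the sum of all runs.
    with ThreadPoolExecutor(max_workers=len(machines)) as pool:
        return list(pool.map(run_suite, machines))

results = run_on_all(MACHINES)
assert all(r["passed"] for r in results)
```

Because the machines are independent, throughput scales with the size of the pool: adding a new hardware generation means adding one entry, not rerunning everything serially.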

A sample test suite running on different machines

On-hardware test automation is also integrated into the CI/CD pipeline. Whenever even a single test fails, the changes are held and the engineer needs to fix the failures before the code can be merged. Once the fix is submitted, all the tests run again to double-check and avoid side effects.

Automating commodity hardware

Most engineering organizations already rely on test automation, especially in the application and web software domains. However, in the low-level and embedded domains, automating tests is often much harder and as a result less common.

At Cyberus, we face several on-hardware automation challenges:

  1. Tests should always start from a clean state - a powered-off device. But our customers sometimes need off-the-shelf laptops, which offer no built-in way to power-cycle them remotely.
  2. We need log output from low-level software and serial output is the easiest way to get logs. Modern laptops often come with few ports and certainly without any serial ports.

Our answer to these challenges lies in an army of small helper devices that emulate power button presses and use Thunderbolt or internal m.2 ports to add serial capabilities so we get the output that is needed. We’ve also developed a specialized device that enables reliable remote access to the machines.
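Once serial output is available, the test harness still has to turn raw console lines into verdicts. Here is a minimal sketch of that step; the `[TEST] name: PASS/FAIL` marker format and the test names are illustrative assumptions, not our actual log convention.

```python
import re

# Hypothetical verdict marker emitted by the software under test.
VERDICT = re.compile(r"\[TEST\] (?P<name>\S+): (?P<verdict>PASS|FAIL)")

def parse_serial_log(lines):
    """Scan captured serial console lines and map test name -> passed."""
    results = {}
    for line in lines:
        m = VERDICT.search(line)
        if m:
            results[m.group("name")] = m.group("verdict") == "PASS"
    return results

log = [
    "booting test image...",
    "[TEST] vmx_entry: PASS",
    "[TEST] msr_access: FAIL",
]
print(parse_serial_log(log))  # {'vmx_entry': True, 'msr_access': False}
```

The same parser works regardless of whether the bytes arrive over a native UART, a Thunderbolt adapter, or an m.2 add-in card, which is what makes the helper-device approach composable.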

Adding serial output to a modern laptop via Thunderbolt

A selection of specialized boards built to automate off-the-shelf hardware (©️Franziska Kestel Fotografie)

Taking it further with benchmarks

Now that we have a system that can run workloads on a set of different hardware, we can also make sure that our other metrics are going in the right direction. We will dive deeper into these topics in upcoming posts. Here’s a sneak peek: different benchmarks run on all our hardware. We use the results to assess and address potential performance issues early in development. Our engineering process treats a performance decrease just like a bug: we find it early, and we strive to never ship it to our customers.
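The "performance decrease is a bug" policy can be sketched as a gate that compares a new benchmark score against a stored baseline. The 5% tolerance and higher-is-better scores are illustrative assumptions; real thresholds depend on the benchmark's noise level.

```python
# Assumed noise tolerance; tune per benchmark in practice.
REGRESSION_TOLERANCE = 0.05

def check_regression(baseline: float, current: float) -> bool:
    """Return True if the current score regressed beyond the tolerance.
    Scores are higher-is-better (e.g. operations per second)."""
    return current < baseline * (1.0 - REGRESSION_TOLERANCE)

assert not check_regression(baseline=100.0, current=97.0)  # within noise
assert check_regression(baseline=100.0, current=90.0)      # flagged as a bug
```

In a CI setting, a flagged regression would hold the change just like a failing functional test, so a slowdown never reaches customers unnoticed.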

In order to give you an idea of our benchmarking setup, here’s a screenshot of a macro-benchmark that runs for each build. It allows us to compare the performance of our KVM Backend for VirtualBox to the original VirtualBox software on different hardware configurations.

We also monitor performance trends over time. The following graph shows an older laptop’s performance over the past 90 days.

We’re here to help!

If you’re transitioning to more automated testing, want to adopt a shift-left approach, or have an interesting test automation challenge, we can help. Feel free to reach out via our contact form!
