How we spent less time testing and improved accuracy

11 Nov 2025 - by Archie Hilton


At Sabre, we’ve grappled with the challenges of testing physical products. As with all software, testing often takes more time than development. This is especially true of embedded firmware, and even more so of complicated embedded firmware.

The Manual Approach

We originally performed complete manual black-box firmware testing; somebody sat with the product and operated it according to a list of instructions, checking tickboxes as they went.

This takes a lot of time; a single test could take multiple hours just to get the device into the right configuration to actually perform the test. Testing a single version of firmware often took months, with usually two or three people completely allocated to the task.

Some tests, such as those for hardware failures, are impossible to perform manually without special equipment. To run them, we had to modify devices with switches and controls that let us temporarily “break” them. Sometimes those units would actually break, and we’d have to spend time repairing them, too.

Another major problem was familiarity. Over time, testers begin relying on their own belief of how the product is meant to work instead of what is actually written down. Two people could perform the exact same test and disagree on whether it passed or failed.

This was not sustainable. We wanted a solution that would give us:

  1. Our time back: stop people wasting time on mundane tests so they can focus on creative, ad-hoc testing.
  2. Speed: perform tests quickly.
  3. Repeatability: remove subjectivity from the test.

While we went through a few ideas internally, we ultimately landed on hardware-in-the-loop simulation.

Hardware in the Loop Simulation (HiL)

Take the microcontroller from the design and fit it to a custom board. Then, instead of attaching all the hardware peripherals (discrete components, I2C, analogue, GPIO), connect them all to another microcontroller. This second “host” processor pretends to be all the various hardware peripherals that the “test subject” processor expects to see when running real firmware on real hardware.

The “host” can then receive instructions from an external source to manipulate hardware state while monitoring the response actions from the “test subject”.
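To make the idea concrete, the instruction channel between the test PC and the host can be as simple as a line-based protocol over a serial link. The sketch below is a hypothetical illustration, not our actual wire format; the command names (GPIO, ADC) and framing are invented for the example:

```python
# Hypothetical line-based framing for instructions sent from the test PC
# to the "host" processor. Command names and framing are invented for
# illustration; a real system would define its own protocol.

def encode_command(kind: str, *args: int) -> bytes:
    """Frame one instruction as an ASCII line, e.g. b'GPIO 3 1\n'."""
    return (" ".join([kind, *map(str, args)]) + "\n").encode("ascii")

def decode_response(line: bytes) -> list[str]:
    """Split a response line from the host back into its fields."""
    return line.decode("ascii").strip().split()

# Example: drive simulated GPIO pin 3 high, and set a fake ADC reading
# of 512 on channel 0.
set_pin = encode_command("GPIO", 3, 1)   # b"GPIO 3 1\n"
set_adc = encode_command("ADC", 0, 512)  # b"ADC 0 512\n"
```

The same framing works in both directions: the host reports the test subject’s output pins and bus traffic back as response lines, which the test script parses with `decode_response`.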

Now that we have a driver board, tests are simply scripts written to manipulate the hardware interface and assert expected outcomes.
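A test script might look something like the sketch below. `HilDevice` and its methods are hypothetical stand-ins for the real driver-board interface; to keep the example self-contained, the device model is faked in memory, but in practice those calls would go over the wire to the host processor and the assertions would observe the test subject’s real outputs:

```python
class HilDevice:
    """Stand-in for the driver-board interface (hypothetical API).
    In a real HiL setup these methods would send instructions to the
    host processor; here they just update an in-memory model so the
    sketch runs on its own."""

    def __init__(self):
        self.pins = {}
        self.sensor_mv = 0

    def set_sensor_millivolts(self, mv: int):
        # Pretend to be the analogue sensor the firmware reads.
        self.sensor_mv = mv
        # Fake the firmware's reaction for this self-contained sketch:
        # a real test would watch the test subject drive this pin itself.
        self.pins["ALARM"] = 1 if mv > 3000 else 0

    def read_pin(self, name: str) -> int:
        return self.pins.get(name, 0)


def test_overvoltage_raises_alarm():
    dev = HilDevice()
    dev.set_sensor_millivolts(1200)   # normal operation
    assert dev.read_pin("ALARM") == 0
    dev.set_sensor_millivolts(3600)   # inject the fault condition
    assert dev.read_pin("ALARM") == 1


test_overvoltage_raises_alarm()
```

Note how the fault-injection step replaces the physical switches we used to solder onto devices: “breaking” the hardware is now one line of script, and nothing needs repairing afterwards.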

This option covers our three original requirements. We have our time back: tests run on the simulators, so we set them going and come back later. We have speed: tests run as quickly as they can. We have repeatability: as long as the tests start from the same state, they will have the same outcome.

The time we now spend writing tests for the simulator is less than it used to take to test a single piece of firmware, and those tests are repeatable on future versions at no added cost. This has led to tangible improvements in our firmware quality and delivery times.

Caveats

HiL testing was a worthy investment on all counts for the improvement in productivity we’ve received. That said, it cannot do everything. We’ve struggled in a few particular areas.

Hardware simulation cannot replace manual testing. You should always validate that the firmware works with real hardware. Once that is checked, operational details can be tested with a simulator.

Hardware simulation has limits. Some tests, like those for complex user interfaces, take so much time and infrastructure to write that it’s quicker to test those parts manually. Tests involving user experience often also contain a subjective element - “is this nice to use?” - that a simulator will never pick up on.

A test’s result is only as valid as its content. Test coverage is still important, as is test scope. This isn’t a problem unique to simulator testing, but it was more easily managed with intelligent human testers who could raise concerns about the validity of a test itself. Now that the test executor performs the task exactly and without question, formal review processes are more important for ensuring that a test actually tests what it claims to.

Conclusion

Hardware-in-the-loop testing stopped us wasting time performing the same steps over and over, separating the thinking time from the doing time. We’ve been able to spend those hours saved getting things done.