When Software Fails, part 1
As quality assurance and software testing experts, our biggest competitor is often an organisation's decision not to test at all. More accurately, some organisations believe their team of developers is adequate to test the latest features, rather than hiring test engineers. As software test engineers, we believe that this thought process is misguided. Developers, whose job it is to create, do not make good testers, whose job it is to destroy.
Part of our job as a software testing organisation is to educate other organisations that software testing is a value-adding activity, that it is not optional, and that it should be treated with as much importance as software development. After all, who wants to spend the time building a house only to find out, after they've moved in, that it is going to fall over?
Not only does this lack of due diligence cost your company money, it also costs your company its reputation in the eyes of its clients.
Reputation is hard won, but easily lost.
It is important to understand that we have not written this article to call out or blame anyone for their software failures. It was written to educate readers on the dangers of not investing enough in testing software, and an investment is exactly the right way to think of it. We all make mistakes, and having another pair of eyes to spot them helps immensely.
The goal of this article is to raise awareness of the risks of not testing software properly by highlighting some high-profile and highly publicised software failures.
Boeing software errors could have doomed Starliner’s uncrewed test flight
The initial error that caused the mission failure was an internal mission timer that was off by 11 hours, which resulted in the thrusters firing at the wrong moment. This is obviously potentially disastrous during launch or in space. The team responsible later found a second bug in the software that could have been equally disastrous.
In this case, mission control was able to intervene, averting a major incident. However, the failure cost millions of dollars and set back Boeing's image as a serious contender in NASA's Commercial Crew Program.
Excel: Why using Microsoft’s tool caused Covid-19 results to be lost
The coronavirus pandemic is the largest public health and economic crisis that humanity has dealt with in living memory. A developer of an infection reporting tool used an outdated file format, the legacy .xls format, which holds only 65,536 rows per sheet; cases beyond that limit were silently dropped, and nearly 16,000 cases went unreported in the UK.
As you can imagine, this gave government officials inaccurate data, which could have led to restrictions being relaxed, further spreading the virus and causing potentially life-threatening problems at a national level.
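To make the failure mode concrete, below is a minimal sketch in Python of the kind of guard that turns this class of problem into a loud failure instead of silent data loss. The pipeline, function names, and data are our own illustration, not the actual reporting tool; the only hard fact it relies on is the 65,536-row limit of the legacy .xls format.

```python
# A minimal sketch (hypothetical pipeline, not the actual reporting tool):
# refuse to export to the legacy .xls format if rows would be silently lost.

XLS_MAX_ROWS = 65_536  # hard row limit of the legacy .xls (BIFF) format


def export_rows(rows, destination_format=".xls"):
    """Refuse to export when the target format cannot hold every row."""
    if destination_format == ".xls" and len(rows) > XLS_MAX_ROWS:
        raise ValueError(
            f"{len(rows)} rows exceed the .xls limit of {XLS_MAX_ROWS}; "
            "use .xlsx or .csv, or split the file, instead of dropping data."
        )
    # ... hand the rows to whatever spreadsheet library the pipeline uses ...
    return len(rows)


if __name__ == "__main__":
    # Simulate a day with more case reports than the legacy format can hold.
    reports = [{"case_id": i} for i in range(70_000)]
    export_rows(reports)  # raises ValueError instead of quietly losing 4,464 rows
```

A check like this, exercised by an automated test with a dataset larger than the limit, would have surfaced the problem long before real case numbers were on the line.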
Insurance and social security data of millions of Dutch citizens leaked due to expired domain name (in Dutch)
This one is in Dutch, but we can explain the gist of it. Because a healthcare facility failed to renew its domain name registration, sensitive information such as patient files and social security numbers was freely accessible. To make matters worse, the patients were children with psychiatric problems.
The Iowa caucus smartphone app disaster, explained
Amid concerns about foreign-state influence on US elections and the need to streamline voting, the Iowa Democratic Party paid $60,000 for the 'IowaReporterApp' to be created, partly to help ensure security but mostly to make tallying votes easier. It did not work, and the blame lies at the feet of the development company hired to create the app. They spent two months on the app and never tested it at the scale it would be used at: state-wide. It is a long read, but worth it, and quite cringe-worthy.
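For illustration only, the sketch below shows what a bare-bones, state-wide load test might look like using nothing but the Python standard library. The endpoint, payload, and precinct count are placeholders; we have no knowledge of the app's real API.

```python
# A bare-bones load-test sketch. The endpoint and payload are placeholders;
# this is not the real IowaReporterApp API.

import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

ENDPOINT = "https://staging.example.invalid/api/submit-results"  # placeholder
PRECINCTS = 1_700  # Iowa has roughly 1,700 precincts


def submit_result(precinct_id: int) -> bool:
    """POST one precinct's (fake) tally; return True if it was accepted."""
    payload = json.dumps({"precinct": precinct_id, "tally": {"candidate_a": 0}})
    request = urllib.request.Request(
        ENDPOINT,
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status == 200
    except OSError:  # connection errors, timeouts, and HTTP errors all count as failures
        return False


if __name__ == "__main__":
    # Fire all precinct submissions at once, as would happen on caucus night.
    with ThreadPoolExecutor(max_workers=100) as pool:
        accepted = list(pool.map(submit_result, range(PRECINCTS)))
    failed = accepted.count(False)
    print(f"{failed} of {PRECINCTS} submissions failed")
```

Even a crude test like this, pointed at a staging environment with election-night volumes, reveals whether the backend falls over long before real caucus-goers find out.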
Dutch site’s zero-day hack: a lesson in vulnerability scanning
This link is a great example of how a bug became a vulnerability that gave hackers access to information that could be used for blackmail. The bug left older versions of the vBulletin forum software vulnerable, and one website running such a version was Hookers.nl, a forum where sex workers talk with clients. In the Netherlands prostitution is legal, but evidence that someone is cheating on their significant other with a sex worker is immensely powerful blackmail material, doubly so when they are government figures. Not to mention that the leak endangered the safety of the sex workers themselves.
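The lesson in the headline is vulnerability scanning: know which versions of your dependencies you run and whether they are older than the first patched release. The sketch below is a simplified, hypothetical check; the version numbers are placeholders, not an authoritative record of which vBulletin releases were affected.

```python
# A simplified dependency version check. The "first patched release" below is
# a placeholder, not an authoritative record of the affected vBulletin versions.

MINIMUM_PATCHED = (5, 5, 4)  # placeholder for the first release with the fix


def parse_version(version: str) -> tuple:
    """Turn a dotted version string such as '5.5.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))


def is_vulnerable(deployed: str, patched=MINIMUM_PATCHED) -> bool:
    """Flag any deployment older than the first patched release."""
    return parse_version(deployed) < patched


if __name__ == "__main__":
    for deployed in ("5.5.2", "5.5.4", "5.6.0"):
        status = "VULNERABLE" if is_vulnerable(deployed) else "ok"
        print(f"vBulletin {deployed}: {status}")
```

In practice you would feed such a check from a vulnerability database and run it in CI, but the principle is the same: an out-of-date dependency should fail a build, not wait to be found by an attacker.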
Data Leak at Housing Corporation Stadgenoot, 30,000 People Warned (in Dutch)
This story is a great example of the false sense of security we place in our technical service providers, in this case a hosting provider. Personal information such as names, email addresses, phone numbers, and home addresses was leaked, though fortunately more sensitive information such as social security numbers was not. Nevertheless, the organisation had to report the leak in compliance with the GDPR, and its reputation could be affected by actions it could not control.
Dutch Consumers Association Discovers Leak at Payment Service Klarna (in Dutch)
Klarna allows users to pay up to 14 days after a purchase, the goal being to let them finalize payment once they have had a chance to try out what they bought. According to a recent report from the Dutch Consumers Association, because Klarna does not require passwords, it is possible to fill in the name, email address, and telephone number of another Klarna user and attribute the charge to them.
Some of the examples in this article demonstrate how often it is the tools or service providers we use that fail us, and it is almost always a case of human error that goes unnoticed. We hope that you can find some parallels between these examples and your own products and services, and that we have encouraged you to take your quality assurance processes more seriously. If not, then hopefully our future articles in this series will encourage you to take steps to protect your organisation's reputation through testing.
Originally published at https://www.spritecloud.com on May 18, 2021.