Technology innovations, such as apps and computer software, have the potential to change our lives in many ways. Developing these technologies is a lengthy process involving a variety of staff, including software developers, user-interface designers, and market-research specialists. Along this creative journey, a large volume of data is collected, but what if the data is ‘negative’?
What happens when the product is shown to have harmful effects on users? This ‘negative’ data could be something that companies want to hide, manipulate, or even destroy.
A striking example of negative data involves cigarettes. It is well known that cigarette manufacturers hid the addictive effects of their product from customers for years, not wanting to harm sales. In fact, they made their products even more addictive than natural tobacco. Selling cigarettes to addicted customers created a cycle of profit that was treated as more important than human health.(1) According to testimony from Frances Haugen, the 2021 Facebook whistleblower, Facebook is aware of its own negative data; that is, unethical algorithms that promote misinformation, hate speech, violence, and depression.(2) These harms could be mitigated through meaningful preventative and corrective action by Facebook, as the guardian of its technology.
Guardians exist at the organisational level; that is, the company as a whole through its Board and Senior Management. However, every employee should also be considered a guardian of their product’s integrity. Whether designing, analysing, managing, packaging, selling or training, each of these steps in a product’s history gives an employee ownership of the product’s integrity. An ethical product is not necessarily free of risk; however, its benefits must exceed its risks, and those risks need to be disclosed to users in a comprehensible way so that they can make an informed choice about using or rejecting the product.
Algorithms and other product aspects require an ethical lens during the design stage and beyond. Accordingly, it is vital that technology companies view ethics as a corporate strength for themselves and their products. This philosophy will empower staff at all levels to speak up safely when they identify ethical risks, including consumer harm. Negative data will always surface during the course of creating new technologies; an organisation’s reaction to negative data reveals its corporate integrity.
If you have tried discussing your ethical concerns about a product or service with your supervisor and have not achieved resolution, consider using your official whistleblower/speak-up channel to request an evaluation of the matter. Most regions of the world have legislation that facilitates whistleblower support and protection (e.g., confidentiality, protection from retaliation). Follow your regional disclosure guidelines as described in legislation (generally summarized in your workplace whistleblower policy). For advisory services about whistleblowing policies, platforms, and employee training, contact us at [email protected]