4 Reasons Why Good Technology Goes Bad

And why the Responsible Technology Assessment can help counteract bad technology

8 min read · Oct 18, 2020

Driven by a vision to change the world for the better, most entrepreneurs set out with not just good intentions, but the very best of intentions, wanting only the best for themselves as well as for others. But as the old saying goes: the road to hell is paved with good intentions.

The moral of this proverb isn’t that we are doomed to create chaos no matter how good our intentions are. The moral is that despite our best efforts to solve pressing social issues and environmental problems, we sometimes find ourselves doing more harm than good.

This is very much the case with modern technology and many of the digital applications we use today. Many technology solutions are built with the greater good in mind. Yet as businesses scale and grow, some solutions change and morph in ways that leave them capable of inflicting varying degrees of harm on people (often customers), eventually changing society in ways the founders never intended nor anticipated.

There are many reasons why this happens. Underlying most of them is the fact that some elements of technology are inherently problematic. Algorithms, for example, are biased by default, simply because their outputs are only as good as the data fed into them: they inherit whatever skews that data contains. To address the possible risks and harms associated with technology, we need to consider the following four areas of concern.
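A toy sketch can make this concrete. The data below is entirely hypothetical; the point is only that a naive "risk score" computed from historical records reproduces whatever skew those records contain, for instance from one neighborhood having been policed more heavily than another.

```python
# Hypothetical historical records: (neighborhood, was_rearrested).
# If "north" was policed more heavily, more re-arrests were recorded
# there, regardless of underlying behavior.
history = [
    ("north", True), ("north", True), ("north", True), ("north", False),
    ("south", True), ("south", False), ("south", False), ("south", False),
]

def risk_score(neighborhood):
    """Naive score: the historical re-arrest rate for the neighborhood."""
    records = [rearrested for n, rearrested in history if n == neighborhood]
    return sum(records) / len(records)

print(risk_score("north"))  # 0.75 -- inherits the skew in the data
print(risk_score("south"))  # 0.25
```

The algorithm itself is trivially "neutral"; the bias lives entirely in the data it was fed, which is exactly why good intentions at design time are not enough.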

1. Bias & Discrimination

In the U.S., courts in states like New York and California use risk assessment algorithms to predict the likelihood that a defendant will reoffend once released from jail. This likelihood is referred to as the defendant's recidivism risk.

One such tool is COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). The software was developed by a for-profit company and uses a set of scores derived from 137 questions to classify criminal defendants as either low or medium/high risk. Courts use the results from the algorithm to…
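The scoring logic described above can be sketched as follows. This is not the actual COMPAS model, which is proprietary; it only illustrates the final step of mapping a score to a risk category, assuming a decile score from 1 to 10 with 1-4 commonly read as "low" risk.

```python
def classify(decile_score):
    """Map a 1-10 decile risk score to a coarse risk category.

    Illustrative only: the real COMPAS questionnaire and scoring
    model are not public.
    """
    if not 1 <= decile_score <= 10:
        raise ValueError("decile score must be between 1 and 10")
    return "low" if decile_score <= 4 else "medium/high"

print(classify(3))  # low
print(classify(8))  # medium/high
```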
