Reading Response #4
Views On Technology
O’Neil has influenced me in so many ways. She has opened my eyes to the way I view technology and has given me a lot of information that I would never have known otherwise. The threat of WMDs is real, and from all the research she has done, it is clear that they pose a genuine danger to our society. Since the beginning of the book, I have been more aware of when my location services are on and which apps track my location, as well as who is listening to me. It still worries me that my social media pages know what I am thinking about, though. Thanks to O’Neil, I now understand algorithms and how they can manipulate, control, and intimidate people when they are abused. I am more careful about what I click on and what I view in my web browser, as well as with the ads that pop up on my social media feed.
I am definitely more skeptical about technology after reading this book, but I am glad I have been able to learn what I know now. O’Neil gave me a reality check with all of the stories she shared that I had no idea about, and I am glad to be aware of what is happening out there.
Suggestions from O’Neil
The only way to overcome WMDs is to come together and tame them, because the data is not going away. O’Neil suggests that we “advance beyond establishing best practices in our data guild… and change our laws” (O’Neil, 2016, p. 206). She says that the models “should be open and available to the public” (O’Neil, 2016, p. 214), and she goes on to explain that we already have the technology to do such things; it’s the “will we’re lacking” (O’Neil, 2016, p. 214). WMDs must be transparent. They should “disclose the input data they’re using as well as the results of their target. And they must be open to audits” (O’Neil, 2016, p. 218). This is extremely important because, if they are used in the correct way, these models have the potential to do real good.
The people who develop and deploy algorithms also need to take accountability. “They should accept responsibility for their influence and develop evidence that what they’re doing isn’t causing harm” (O’Neil, 2016, p. 223). Doing this doesn’t mean that the algorithms should be forced to be open source; it simply puts the burden of proof on the company. Doing so will make companies fairer and more accurate.
In other words, we need to make data more public so that we can trust the algorithms. It has to be a group effort, and “we must demand that systems that hold algorithms accountable become ubiquitous as well” (O’Neil, 2016, p. 231). “We can’t afford to do otherwise” (O’Neil, 2016, p. 231).