This is the class blog for a University of Michigan course, SI 410 Ethics and Information Technology. The site is open for public reading but not for public commentary.
Friday, February 7, 2020
Is technology racist or is the person who invented it?
"Did someone blink?" I can clearly tell that this girl's eyes are open but the technology and sensors of the camera did not. Who's fault is that? The camera's the inventors?
In his article "Does Technology Have Race?", David Hankerson raises a very interesting problem: some modern-day technologies behave in a discriminatory manner toward specific groups of people. For example, a soap dispenser detects a white person's hand but fails to detect a black person's hand. An Apple Watch cannot detect a person of color's pulse. A camera repeatedly asks whether the people in an Asian family's photo are blinking. And, most notoriously, Google detected black people in photographs and categorized them as "Gorillas" or "Apes".
What struck me most about the article was reading that there are ways to fix these racial issues and controversies that technology is bringing about. If people in India are able to make their sensors more inclusive of all skin tones, then why can't America?
This point leads me to conclude that no, technology is not racist, and neither are the inventors of technologies and algorithms. The users and providers are.
Hankerson's article was meant to raise awareness of the fact that technology is making people feel excluded. While I agree that it is important to spread this awareness, especially to those who do not face this issue, I also think it is important that the users and the companies who sell and provide this technology follow guidelines and rules to make the tech as inclusive as possible, especially when other countries have shown that these changes can be made.
So who's to blame? It is hard to say, because we cannot know the true intentions of the inventors of a product or algorithm. Likewise, how do we blame an algorithm? That is why I believe the people who are in control of changing or training the algorithm, those with the power to change the technology, should bear some of the fault and should heed warnings like the ones Hankerson is putting out.
This post is an interesting read and covers a really important topic in the world of inclusive tech design. One thing I would add is a connection to one of the readings from the first three weeks of the class. For example, do any of Vallor's virtues apply to inclusive design? Did Gelernter consider inclusivity in his Second Coming manifesto?
You pose an interesting question. It is definitely not the algorithm's fault, since algorithms do not technically make decisions; they follow a set of rules they were told to follow. The inventors can be at fault, but if the intention wasn't present, they may be responsible for the malfunction without being morally responsible for it. Maybe it falls on no one.
I would combine the third and fourth paragraphs, since they both discuss the article you referenced at the beginning. Your blog was very informative, but I feel that this type of medium is meant to be more colloquial and personal. I suggest you speak your mind and write how you feel: talk about a time you were a victim of such a malfunction and base your argument on those experiences. That will give the post a more relatable tone.
This is an interesting reflection on the current standards for artificial intelligence, particularly in the Western world. Your argument is made especially compelling by all of the examples of racist technology you mention, and your stance is clear from the very beginning, which is also a plus. I think your argument could benefit from more analysis of the standards of AI and the possible solutions we could implement to make it a more inclusive field. As it stands, the post reads like a summary of the original article. Maybe try to make it more your own. Great job anyway!
I think you did a good job of giving an overview of the topic and tying it into the Hankerson reading for class. One thing that stands out in the opening section is that I would have wanted to see links to sources for the examples you mention, such as the soap dispenser, as I would have liked to read more.
Regarding the topic itself, I wonder whether people in India intentionally made their sensors more "inclusive", or whether the people there simply have darker skin tones and the providers and inventors are merely responding to their target demographic. I am also not entirely convinced by your argument that it is the users and providers who are racist rather than the inventors, and I think this claim needs more supporting evidence.
I like that you take such a strong side on this point. I think it's definitely a problem that technology only caters to a certain portion of people. One thing that would make this even better is some analysis of the other points of view. For example, what extra processes might a company (or country) have to go through in order to achieve a universally unbiased product? Is it worth it? Also, there are a few stray letters that take away from an otherwise well-written post, which I recommend removing.
I love how your post takes on the issue of race in technology, as many white Americans can simply assume that technology will work for them. While I agree that technology itself cannot be racist, according to Hankerson it can be biased, and a distinction between bias and racism would make an excellent addition to your post. You also say that the intentions of the designers cannot be known, but Hankerson provides examples of major manufacturers that have yet to address known biases, which might call those companies' priorities into question.