Thursday, February 20, 2020

Snapchat Filters Don't Work?

If you have Snapchat on your phone, you have probably sent a friend a selfie using one of its filters, also known as lenses. The photo-messaging app uses facial recognition technology so that users can transform their faces into child-like versions of themselves, appear as another gender, and more. Unfortunately, these filters are not so fun for everyone. Snapchat lenses have a hard time identifying women and darker-skinned people because of bias in their facial recognition algorithms. You probably have many questions popping into your head right now. How is this possible? Is Snapchat broken? Are you saying Snapchat is racist and/or sexist?
This Snapchat issue can be explained by pre-existing and technical bias, two categories Philip Brey discusses in his chapter "Values in technology and disclosive computer ethics." Pre-existing bias comes "from values and attitudes" the designer held before designing the application, while technical bias comes "from technical constraints or considerations."
So, how does this relate to Snapchat? The technology world is a field dominated by white men. Engineers at Snapchat could unintentionally be designing their systems around the faces they know best: their own. This reflects the developers' pre-existing bias. A single engineer can affect many lives, because the algorithms they create can spread that bias on a massive scale to millions of users.
Atima Lui, an African American woman with a darker complexion, has trouble using Snapchat filters (as shown in the gif below!). She states that "if developers are racially biased, the technology they produce is likely to be racially biased, too." If Atima Lui were not convincing enough, Joy Buolamwini's research with the MIT Media Lab showed "that darker-skinned females are the most misclassified group (with error rates of up to 34.4%)." Unsurprisingly, these are the same demographics that are underrepresented in the tech industry.
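To make numbers like these concrete, here is a minimal sketch in Python (using pandas and made-up audit results, not the study's actual data) of how per-group error rates are computed: record whether the classifier got each face right, then compare the misclassification rate across demographic groups.

import pandas as pd

# Hypothetical audit results: each row is one test image, its
# demographic group, and whether the classifier labeled it correctly.
audit = pd.DataFrame({
    "group":   ["darker-skinned female"] * 3 + ["lighter-skinned male"] * 3,
    "correct": [False, False, True, True, True, True],
})

# Error rate per group = fraction of misclassified faces in that group.
error_rates = 1 - audit.groupby("group")["correct"].mean()
print(error_rates)  # the darker-skinned female group shows a far higher rate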
On the other hand, technical bias can appear when there are not enough diverse datasets to train the algorithm. The facial recognition system may have been trained largely on datasets of fair-skinned people because those images were more readily available than images of minorities; the toy sketch below illustrates how that imbalance plays out. This limitation of the available data and tools raises questions the company has to answer. Should Snapchat let people know their technology does not work for everyone and continue to release new filters? Should they even release new technology with these flaws?
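As a toy illustration of that technical bias (a hedged sketch with synthetic data and scikit-learn, not Snapchat's actual pipeline; all names and numbers are invented), the code below trains one model on data dominated by group A and then measures its error on balanced test sets for each group. The underrepresented group B ends up with a much higher error rate, even though the model never "intends" to discriminate.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)

def make_group(n, center):
    # Two synthetic "face features" per sample; each group's data is
    # centered differently, so each group needs a slightly different
    # decision boundary to be classified correctly.
    X = rng.normal(loc=center, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * center).astype(int)
    return X, y

# Imbalanced training set: 950 faces from group A, only 50 from group B.
X_a, y_a = make_group(950, center=0.0)
X_b, y_b = make_group(50, center=3.0)
model = LogisticRegression().fit(np.vstack([X_a, X_b]), np.hstack([y_a, y_b]))

# Evaluate on balanced held-out sets: group B's error is much higher,
# because the model mostly learned group A's boundary.
for name, center in [("group A", 0.0), ("group B", 3.0)]:
    X_test, y_test = make_group(1000, center)
    print(name, "error rate:", round(1 - model.score(X_test, y_test), 2))

The point of the sketch is only that training-set composition alone can produce disparate error rates; more balanced data would let one model serve both groups.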
Although Snapchat lenses are a fun way to connect with your friends and family, they have flaws that result from the racial and gender bias in the tech world that Brey's categories help explain. As an underrepresented woman in the field of technology and a future computer scientist, I would like to be an advocate for "algorithm transparency." Next year, I will be taking the Machine Learning course at the University of Michigan, and after learning more about the field, I want to fight for more inclusion in it. Ideally, these automated decisions should be fair, transparent, and inclusive for everyone, whether in Snapchat filters or other uses of facial recognition technology.

1 comment:

  1. Hi Aashia,

    I read and commented on your previous post, so it was nice to see the progress from the first. The slight changes to how and where you introduced the readings created a better flow, and you kept the good contrast between the reasons behind the Snapchat issue.

    I liked the new pictures, but they could use captions explaining what they show, rather than leaving readers to assume they relate to the article because they depict facial recognition. The last paragraph was better as well: it wasn't just a call to action but explained how you specifically hope to fix this problem.

