Friday, February 7, 2020

Who am I talking to?

Atlas, BioShock (2007)
Your plane goes down in the middle of the ocean and there is nobody to help you, save for a voice over the radio calling himself Atlas. This is the premise of 2K Games' BioShock. You later find out that "Atlas" doesn't really exist; he is a persona created by a man named Frank Fontaine so that Fontaine can use you in his nefarious plans. Fontaine was directly lying to you by claiming to be Atlas, and that lie leads you into direct conflict with the game's main antagonist and gets you physically hurt. Here, although fictitious, is an instance of a lie directly harming a person, the kind of harm Harry Frankfurt discusses in his article[1]. Naturally, the harm from lies isn't exclusively physical; it can come in many forms. As can lies.


Descript is a company focused on building tools for media editors: video editing software, transcription software, and more. In 2019, Descript acquired the Canadian startup Lyrebird. Lyrebird's software, in essence, takes a sample recording of a person's voice and replicates it. With a short sample of audio, you could make anybody say anything.
Right now, on Descript's website, you can try a feature called Overdub. Overdub gives you a sample sentence spoken by an AI, with a replaceable portion you can fill in with anything you want. For example, the first sample sentence given is "I should probably _____ this year." There is a character limit of about 30 characters, but you can make it say some interesting things, and most of these sentences sound completely natural, tonally and rhythmically speaking. The original speaker whose voice was sampled has almost certainly never said the sentence "I should probably eat 32,000 pizzas this year," yet the voice sounds like a near carbon copy of the original sample.
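To make the mechanics concrete, here is a toy sketch of that fill-in-the-blank step in Python. Everything in it is hypothetical: Descript's actual API is not public in this form, and the ~30-character cap is just what the demo described above suggests.

```python
# Toy illustration of an Overdub-style template with a replaceable blank.
# This is NOT Descript's real API; names and the limit are assumptions.

MAX_FILL_CHARS = 30  # approximate limit observed in the demo


def fill_template(template: str, replacement: str) -> str:
    """Splice user text into the template's blank, enforcing the length cap."""
    if len(replacement) > MAX_FILL_CHARS:
        raise ValueError(f"Replacement exceeds {MAX_FILL_CHARS} characters")
    return template.replace("_____", replacement)


sentence = fill_template("I should probably _____ this year.",
                         "eat 32,000 pizzas")
print(sentence)  # I should probably eat 32,000 pizzas this year.
# A real system would then hand this text to the voice model for synthesis.
```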

Voice synthesis is still in what James Moor, in his article[2], calls the introduction stage, where a technology is being developed and used mainly by experts. But technology moves fast, and I could reasonably see this being streamlined for consumer use in 5 to 10 years.

The ramifications of this are immense and varied. You might one day get a call from a family member and only later realize you were talking to a phone scammer using a synthesized voice to impersonate them and steal from you. Or someone could use a synthesized voice of a politician or celebrity to make it appear that they said things that would turn public opinion against them. We need to decide which lies are okay, and how we can deal with the ones that aren't. Do we ban any software that can mimic voices? Do we go to the opposite end of the spectrum and allow any software to take samples and create sentences a person has never said? Or maybe a middle ground that simply requires a person's written consent before their voice is run through the software (sketched below).
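For illustration only, here is one minimal way that consent-based middle ground could look in code. The registry, speaker IDs, and synthesize() function are all invented for this sketch; no real product works exactly this way.

```python
# Minimal sketch of a consent gate in front of voice cloning.
# All names here are hypothetical, not any vendor's actual interface.

consent_registry = {"speaker_042"}  # speakers with written consent on file


def synthesize(speaker_id: str, text: str) -> bytes:
    """Refuse to clone a voice unless the speaker has consented."""
    if speaker_id not in consent_registry:
        raise PermissionError(f"No recorded consent for {speaker_id}")
    # ...hand off to the (hypothetical) voice model here...
    return b""  # placeholder for synthesized audio


synthesize("speaker_042", "I should probably exercise this year.")  # allowed
# synthesize("speaker_999", "...")  # would raise PermissionError
```

The point of the sketch is only that the consent check sits in front of the model, so the default is refusal rather than synthesis.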


Again, with this software, anyone can be made to say anything.

[1] Frankfurt, H., "On Truth, Lies, and Bullshit"
[2] Moor, J. H., "Why We Need Better Ethics for Emerging Technologies"

1 comment:

  1. Your examples tie together very well. The video game is a good setup for what could happen if we rely on information that is later revealed as a lie. The voice editing technology example makes this scenario a potential reality in our lives, and it is an interesting issue to discuss.

    To make your argument stronger, it would help to explain the examples from the readings more thoroughly. A direct quote or more paraphrasing would help you describe what these examples relate to in the reading. Moving a sentence about your main point to the end of the first paragraph will help us readers understand why we should care about these examples.

