Monday, November 2, 2009

Lessons




After the first reading, at least for me, the question of whether androids had human rights seemed silly. I mean... they are man-made robots, OK? They have no emotions, no sympathy, no empathy, no love. They can never be “affected by the condition of another with feeling similar or corresponding to that of the other” (Anthology, 274M) or be “moved by the suffering or distress of another” (Anthology, 274J). I couldn't understand how they could even be considered for human rights when they were so decidedly not human. For me, even though androids had human characteristics, even though they looked like humans, and acted like humans, and could imitate our mannerisms, there was no getting around the fact that... they are robots. They are not alive, and there seemed to me to be no point in treating them like they were. When Deckard was killing them off, it was just him “taking care” of a malfunctioning machine.


Though androids might look more like humans than this robot, they are still machines.

http://www.dailygalaxy.com/photos/uncategorized/2008/12/24/asimorobot_48.jpg

Except... then along came Luba Luft. I feel like Deckard sums up my feelings after her extermination perfectly: “so much for the distinction between authentic living humans and humanoid constructs” (Dick, 142). Well, YEAH. SO MUCH FOR THAT DISTINCTION. I feel like that's exactly what Philip Dick was getting at with her death.


Is there a distinction?


In the movie Terminator, this character was a full android, but this picture illustrates my point well: with the introduction of Luba, the distinction between humans and androids becomes fuzzy because of the emotional response she evoked in me.

http://screenrant.com/wp-content/uploads/terminator2.jpg

I still stand by what I said in my last DB: that there is a distinction, that a human's ability to WANT to feel empathy, to be acutely aware of the empathy they are lacking, is what distinguishes humans from other beings. But like Deckard, I really felt for Luba. And my feelings weren't anything like what Phil Resch told Deckard his feelings were. I couldn't even see Luba, so my emotional response to her death definitely had nothing to do with “love toward... an android imitation.” For me, it was more that Luba, through her music and her slightly vapid, innocent personality, evoked an emotional response from me. I got kind of attached to her. And, to be honest, I was pretty mad when Deckard shot her. I liked her!


So anyway, then I got to thinking about some of the other technically non-living things I've gotten attached to. The list is actually embarrassingly long. However, probably the non-living thing I've been most attached to was my Simba stuffed animal. That guy was my bff from the age of 3 all the way to a terribly old age that is far beyond the appropriate age for being friends with a toy (coughOLDcough). And he was just... great, you know? I remember after one particularly horrific day at school, I got home and my mom was busy yelling at my sister for... I don't know, eating dirt or something, so she didn't have time to listen to my sad tale. Guess who was there for me? That's right. SIMBA. He listened to the whole story without interrupting me, and when I finished, he didn't judge me or try to tell me what to do. He just gave me a nice, big lion hug. How could I not get emotionally attached? He was such a good friend! And I would have been heartbroken had a bounty hunter come into my room and exterminated him.


A Simba much like this one was one of my best friends.

http://www.hermanstreet.com/store/media/img/00/48/51/07/0048510730012_100x100.jpg

So, I do think that androids should have human rights, not because I think that they are like humans, or like animals, or really like any “living thing”. I think that androids should have human rights purely for the selfish reason that I could very easily become emotionally attached to... any of them. Of course, this creates problems. Androids are not as cute and cuddly as my old Simba toy. They're not alive, and they have no empathy, and in Androids we see that they can be a danger to humans. But if we “suspend theory in order to focus on what the person or object in front of us might teach us”, we might realize that androids, despite all of their un-human flaws, may be able to teach us a higher form of empathy. I mean, it's easy to feel empathetic towards something that is like you. But with androids, I feel like we have to search more, and find that small sliver of likeness that connects us, in order to be empathetic and emotional towards them.


Androids can teach us how to be empathetic and loving towards something that is quite unlike us.

http://hannahsworld.files.wordpress.com/2008/02/hannah-heart.jpg

In fact, I think I may have already begun to learn this lesson of empathy. I like those androids. I like them and it makes me very upset to think that they're being killed off for being exactly what we made them to be: unsympathetic, empathy-lacking machines.
