An outstanding Westworld 1.6 tonight - the series keeps getting better and better, and Isaac Asimov (author of the three, later four, laws of robotics) would've loved it - now getting into some of the real paradoxes and ethical quandaries of artificial intelligence.
Early in the episode, Maeve, who gets herself choked to death so she can get back to her favorite programmer, obliges us to confront one of the fundamental questions of AI: if an android is programmed to behave in an unprogrammed way, and indeed behaves in that way, is the android behaving in a programmed or unprogrammed manner? This is a version of the infamous "be spontaneous" paradox: if someone asks you to act spontaneously, and you follow that advice and act spontaneously, are you really acting spontaneously or in response to the request? Or, to flip that, if you defy the request and try to act un-spontaneously, are you acting spontaneously or not? No matter how much you try to wiggle and wriggle out of it, the paradox has you in its grip.
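For the programmers in the audience, here's a purely illustrative little Python sketch of the paradox - nothing from the show, just my own toy example - in which the very instruction to improvise is itself a line in the script:

```python
import random

class Host:
    def __init__(self):
        # The "spontaneity" directive is just another programmed step.
        self.script = ["greet guest", "pour drink", "improvise"]

    def act(self):
        for step in self.script:
            if step == "improvise":
                # Even this "unscripted" behavior is selected by code the
                # programmers wrote - so is it programmed or unprogrammed?
                yield random.choice(["wink", "deal cards", "hum a tune"])
            else:
                yield step

maeve = Host()
for action in maeve.act():
    print(action)
```

However the host "improvises," the improvisation is produced by code its programmers put there - which is the paradox in a nutshell.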
Maeve nonetheless goes on, butt-naked, to negotiate a bunch of improvements to her mentality. Dolores was also naked last week, when she too was having a conversation not with just any programmer but with Ford - but Dolores didn't turn any tables on Ford, at least not yet.
This week, Ford is on the move, and ends up in a house with some of the very first hosts, programmed by Arnold. A boy we met last week kills his dog (a dog-droid or whatever the word would be), and Ford finds out, on carefully questioning the boy, that he did it because a voice in his head told him to do this.
The voice in the head - we found out the same about Dolores last week - brings us back to the bicameral mind. Arnold has apparently created artificially intelligent beings - aka hosts - who receive some kind of instruction from him via his voice in their heads. This is a nice play on Julian Jaynes' bicameral mind, and raises the question, again, of whether Arnold is really dead - at least, dead in original flesh and blood - or just a talkative chip in the android brain.
But underlying all of this is the ethical elephant in the room: if our hosts are becoming truly sentient - due to some combination of original programming, programming gone wrong, and/or new overlays put in by Arnold or someone else in the park - do the human programmers have the right to order these androids around to suit their human interests?
I look forward to further exploration of these paradoxes of programming in the episodes ahead. In the meantime, here's a little essay I published about The Rights of Robots back in 1999.
See also Westworld 1.1: Isaac Asimov and Philip K. Dick Served Up by Jonathan Nolan, Lisa Joy, and J. J. Abrams ... Westworld 1.2: Who Is the Man in Black? ... Westworld 1.3: Julian Jaynes and Arnold ... Westworld 1.4: Vacation, Connie Francis, and Kurt Vonnegut ... Westworld 1.5: The Voice Inside Dolores
paradoxes of AI abound