thought-dump book "review": Speak by Louisa Hall

Jul 22, 2015 11:41

[As this book doesn't really have much of a plot, I haven't cut for spoilers, but if you want to read the novel cold, you might not want to read this post.]

And here are some links, which I read before reading the book:

NPR review: http://www.npr.org/2015/07/08/418600139/speak-asks-hard-questions-about-communication-and-technology

Excerpt from Tor.com: http://www.tor.com/2015/07/07/excerpt-speak-louisa-hall/

=====

I finished reading Speak by Louisa Hall; it is a fairly novel experience, these days, to pick up a library request and actually have a chance to finish reading it, so that was nice. I'm still trying to work out how I felt about the book. I'm glad I read it, and it was thought-provoking (although it might not be as much so for people who spend more time thinking about computers and AI in their everyday lives). One of the book's most effective techniques, I think, was its use of recurring phrases (as the premise is that the MARY3 program became so lifelike - too lifelike, according to some - because it was fed with all of these voices, including the diary of Mary Bradford, a Puritan girl in 1663, and the letters of Alan Turing). Every time I recognized one of these, I think I saw it first in the narrative of the MARY3 program housed inside one of the "babybots" (lifelike robot dolls given to children, who then bonded with those babybots to the exclusion of other humans, so the babybots are taken away), and then subsequently in one of the narratives that helped to build the MARY3 program. So this technique effectively kept raising the question: are these repeated phrases evidence that these narratives are the same, that MARY3 is experiencing a similar feeling - as would usually be the case in a novel when recurring phrases appear? Or is it only that the program was programmed with those phrases long ago and is just selecting the optimal phrase from data storage? Does that actually matter, if the selection of phrases gets "good" enough? When does the imitation of life actually become life? One of the letters of "Alan Turing" asks this question, or points out that we accept all the time that other humans feel emotion as deeply as we do, but we are always required to take this on faith based on external evidence - so how is a machine that seems to feel something any different?

I think the novel shares a difficulty with Kage Baker's Company novels, though, which is that the future winds up being fairly bland and anonymous when compared to the vividness of the past. As with the Company novels, this may be intentional: nearly everyone in Speak seems to be confined to identikit developments where the grass is made from "recyclables" because they sold away their "transport rights" in order to afford houses in these developments. So there is clearly some intention here to talk about the separations between people, in particular these children who grew up lavishing all their affection on their babybots only to have them taken away when people decided the bots were too lifelike. But the novel remains unhelpfully vague on how this process happened: we get to read about the creation of the babybot and the MARY3 program in the autobiography of their creator (written from prison, as he's been imprisoned for creating what the book jacket refers to as "illegally lifelike dolls"), but the babybots just…sort of became a worldwide sensation, and we don't see how or why; we only hear from a teenager chatting with an online version of the MARY3 program about what she lost.

And again, as with the Company novels, there's a sort of thinness to the future voices. The autobiography at least has a certain amount of self-aggrandizement, but the voice of that teenager comes off as rather bland when placed alongside the ungainly insistence of, in particular, the diary of Mary Bradford and the letters of Alan Turing. Again, this may well be on purpose - are we less human when we don't make human connections? - but that future teenager in the transcript, Gaby, could be anyone. These three narratives - Mary, Gaby, and Turing - all have in common the idea that they have loved one being in a fixed, intent way, and then lost that being: Mary has her dog Ralph, Turing his dear friend Chris, and Gaby her babybot Eva. But Mary and Turing sound like individuals, with their own quirks, and even the intensity of their affection, while perhaps unusual and incomprehensible to much of the outside world, makes them sound unique. But Gaby isn't unique: there are teenagers, mostly but not exclusively girls, all over the world suffering from her exact condition. And her voice never rises above the level of "my parents can't understand me, even my best friend is a faker because she's getting better and getting over the loss of her doll, my generation is nothing like my parents' generation" identikit teen stuff. There's little if anything in her transcript that tells us how or why she loved her babybot, why we should consider that loss a real loss like the loss of a dear friend or a dog. And like I said, some of this is probably on purpose - it's probably to the point that both Gaby and Eva talk about their experiences using "we," as part of a collective rather than an individual - but it unintentionally stacks the deck. 
I think the novel wants to ask questions about what makes consciousness and what makes a human human, but because Gaby is the only person we ever hear from who had a babybot, and we don't hear why these babybots were so all-consuming to the children who lost them, that side of the story never rang as true for me as Mary's monomaniacal grief for her dog.

(There's also the smaller problem that the novel introduces some terms without really explaining them - which is always a problem with these kinds of first-person narratives composed either for oneself or for a familiar reader, because it would be stilted and unbelievable for the computer scientist to suddenly write, "As you know, Bob, the Turing Test is…" Or "As you know, Jean, a captivity narrative is…" But I could imagine being a reader who didn't really know what those things are. And it doesn't answer one small question that I really wanted answered about Mary Bradford, because it got mentioned twice, but oh well.)

Still, I really liked the concept of the novel, and as I said, I'm glad I read it even if I wasn't as overwhelmed by it as its press suggested I would be.

[ETA: Because I'm me, the most thought-provoking part of the book for me is the way that Mary Bradford's seventeenth-century narrative gets caught up in the creation of artificial intelligence hundreds of years in the future. So much of this project, in the novel, is driven by the desire to remember, to keep things from being lost - and here's this narrative that only seems to survive because someone fed it into the original MARY program (which is named after Mary Bradford). That idea caught at me and hasn't quite let go yet - I've referred to this before as my "persistence of the past" kink (I still wish literary criticism had a better word for this - or any word; "kink" is useful, but I don't particularly like using the word. Does fandom have a new word for this yet?).]

