

The Lifecycle of Software Objects

by Ted Chiang

4 out of 5 stars ★ ★ ★ ★ ☆


Anyone else remember Creatures? I played that game when I was younger … I might still have it around somewhere in a closet. Hmm, maybe I should dig it out. Because The Lifecycle of Software Objects reminded me of Creatures (albeit without the breeding). The digients in Ted Chiang's novella are artificially intelligent software programs, each of which begins as a genome created by software developers. The genome is just a starting place, however, and more complex traits emerge as the digients learn from human interaction. The digients are all capable of learning human speech, and some can even learn to read. Yet they all develop distinct personalities, influenced by their owners.

Blue Gamma, the company that creates digients, envisions them as a hugely successful brand of sophisticated digital pet. And they are—successful, that is—for a time. The fad peaks, and the company folds, leaving two of its employees, Derek and Ana, among a small group of hardcore digient owners. These people continue to run their digients full-time (instead of "suspending" the digients indefinitely), paying them visits in their Second Life-esque digital environment and interacting with them in the real world through the use of robot bodies. The relationship between the digients and their owners is similar to that between a child and a parent, but there are some notable differences. For instance, digients lack physical bodies and the corresponding hormonal changes, so they do not undergo puberty. Instead, they continue to learn and change indefinitely. Yet attempting to apply a human metric for development, as Derek soon learns, will always be frustrating, because the digients aren't human.

This is a refreshing reminder. I often get frustrated with the way some science fiction portrays artificial intelligence so inconsistently. Take Star Trek: Voyager, for example. The Emergency Medical Hologram, or as everyone calls him, the Doctor, begins the series "integrated into the sickbay systems" (that's from "Eye of the Needle"). Eventually he acquires some slick 29th-century technology that lets him leave sickbay and even Voyager itself. Every time the Doctor goes on such a mission, there is the risk that his program will be lost—but why? Later in the series ("Living Witness") we see a backup version of the Doctor, so either they started with the capability to back up the Doctor or developed it later down the line. Either way, it seems to me that this is an aspect of artificial intelligence that science fiction often sorely neglects for the sake of storytelling: it's easy to copy a computer program.

The awareness that the digients are nothing more than complex, evolved programs underlies The Lifecycle of Software Objects. Early in the story, as the flagship digients of Blue Gamma undergo their training, one of the executives learns that a digient has picked up a profanity from a trainer. So he orders a rollback, just to be safe. Rollbacks, as the term implies, erase all the memories and experiences a digient has accumulated since an earlier restore point, essentially changing who the digient is. And one of the major problems Ana and Derek must overcome is that the Second Life-esque environment where the digients live, Data Earth, has become obsolete, isolating the digients from all the friends who have moved on to a more advanced digital world. The digients need their engine, Neuroblast, ported to this new platform, but the cost is prohibitive. Without the port, the digients are confined to a private, sandbox version of Data Earth, one that only their owners visit. It's not really a life; it's a prison sentence, and all because technology has begun moving on without them.

Even if their owners persist in seeing them as more human than they are, the digients hold no such prejudice. Oh, they want freedoms, yes; Derek's two digients, Marco and Polo, yearn to become corporations so that they can have rights under the law. One of the solutions proposed to fund the porting of Neuroblast is to prostitute some of the digients to a cybersex company. The company would train copies of the digients, modifying their reward maps so that they find pleasurable what their owners find pleasurable. It's a little creepy, and Ana and Derek are very uncomfortable with it. Marco, however, decides he wants to do it:

"…I don't think you understand what they want to do."

Marco gives him a look of frustration. "I do. They make me like what they want me like, even if I not like it now."

Derek realizes Marco does understand. "And you don't think that's wrong."

"Why wrong? All things I like now, I like because Blue Gamma made me like. That not wrong."

Marco is very comfortable with the fact that he is a creation of Blue Gamma, and he is just as comfortable with the idea that he is not unique, in the sense that his program can easily be copied and redistributed:

Derek feels himself growing exasperated. "So do you want to be a corporation and make your own decisions, or do you want someone else to make your decisions? Which one is it?"

Marco thinks about that. "Maybe I try both. One copy me become corporation, second copy me work for Binary Desire."

"You don't mind having copies made of you?"

"Polo copy of me. That not wrong."

Chiang's characterization rings true: I think that, as singular beings, we approach the idea of digital existence with some trepidation. If I can back up my mind elsewhere, and then I suffer an accident in this body, that backup can be downloaded into another body and activated. "I" will survive, but it won't really be me; it will be a copy of me. Since up until now there has only ever been one of each of us (or at least, that's the way we perceive it), our brains aren't really equipped to handle that kind of philosophical crisis. To the digients, on the other hand, it is natural. And I think this will be true of any artificial intelligence: it will have to come to terms with its existence as software and the fact that software can be copied.

The Lifecycle of Software Objects takes place over a deceptively long period of time. It seems like almost every chapter begins with some form of "another year passes", so despite the novella's brevity, at least a decade elapses over the course of the story. Initially, Ana and Derek focus on protecting their digients from external threats: people who would copy and exploit their digients, and the isolation brought on by the obsolescence of Data Earth. Yet eventually they come to realize that this protection, while well-intentioned, also stymies the digients' growth. One day the digients will want autonomy, and part of the progress towards that autonomy involves hard work and pushing the digients to explore their capabilities. I love the closing line: "'Playtime's over, Jax,' she says. 'Time to do your homework.'"

The Lifecycle of Software Objects is an intriguing story about raising digital life. On one level, it is a fresh look at the tropes of artificial intelligence that are becoming increasingly common in our science fiction. It includes the realities of the contemporary technology sector—the deadlines, the capitalist goals, the replacement of existing platforms with newer, better ones that might not be compatible. Overall, it contains some very smart observations about AI and the development of technology, so colour me impressed. As a novella, it feels almost the perfect length. Chiang's concepts are amazing, but his characterization is definitely Lifecycle's weakest link: too often we are told how Derek and Ana feel instead of seeing it. Although I suspect Chiang could have fleshed out his concepts and their underlying themes enough to turn this into a novel, I appreciate his circumspection and elision. This is a story painted in very broad strokes, tracing two characters whose lives intersect in myriad ways, and the digital creations they both hold dear.

