The Lifecycle of Software Objects
Can you analyze Ted Chiang's "The Lifecycle of Software Objects" for me? Just read it and am still processing it
Asked in the late-night glow of an e-reader, by a reader who has just finished the novella and finds themselves unexpectedly moved by a story about virtual pets.
You just finished a story about virtual pets and you're feeling things you didn't expect to feel. That's the Chiang effect. His 2010 novella starts with an idea that sounds technical—what if AI couldn't be programmed but had to be raised?—and ends somewhere much stranger, in territory that has less to do with technology than with the terrible math of loving things that might not survive their next funding round. Chiang worked as a technical writer at Microsoft before becoming science fiction's most decorated short-story writer, and he wanted to push back against what he calls "the typical science-fiction depiction of AI": the loyal butler, the instant genius, the machine you flip on and it's ready to serve. But the novella that resulted isn't really about artificial intelligence. It's about what happens when care becomes economically inconvenient.
The novella appeared at a moment when the Tamagotchi had already taught an entire generation that caring for digital creatures was emotionally compelling and deeply tedious. Second Life had risen and fallen. The iPhone was three years old. Chiang's digients—the artificial beings at the story's center—need years of interaction to avoid "going feral." They need to be raised, not programmed. And the market, predictably, loses interest.
The standard reading focuses on parenting. Ana and Derek become parents to their digients, Jax and Marco, and the story traces the familiar tensions: how much freedom to grant, when to protect versus when to let them fail, what it means to prepare a dependent being for a world that may not want them. Elizabeth Bear, reviewing the novella, compared it to "anyone who had ever had charge of a child—or even a pet." This is true enough, and Chiang clearly intended the parallel. The digients start cute and simple, grow complicated and demanding, and eventually face questions about sex and autonomy that would make any parent uncomfortable.
But the parenting reading obscures something stranger. Ana doesn't start caring for digients because she wants to be a mother. She takes the job because she needs work. The emotional relationship emerges from economic necessity, and it survives the collapse of the company that created that necessity. Ana starts working for money and ends up working for love. But the digients remain stuck in the economic frame: they are, legally and practically, property. They can be bought and sold, copied and modified. Their very existence depends on someone being willing to pay for the servers that host them.
This is where the novella gets genuinely uncomfortable. Near the end, a company called Binary Desire offers to fund the continued development of the digients in exchange for the right to use copies of them as sexual companions. The digients are now ten years old—old enough to have opinions, sophisticated enough to consent in some meaningful sense, but also young enough that the proposal feels deeply wrong. The digient owners reject the offer, but Derek eventually agrees to let his digient Marco choose for himself. Marco consents. Derek takes the money. Ana refuses for Jax, promising instead to discover together what "adulthood" might mean for a digital being.
The consensus interpretation treats this as a meditation on consent and exploitation. But consider: Chiang is writing about the only scenario in which the digients' continued existence is economically viable. Without the sex-work money, Neuroblast—the engine the digients run on—will become abandonware. The digients will either cease to exist or persist in a kind of digital coma. Binary Desire isn't exploiting the digients any more than capitalism exploits everyone. It's offering a deal that makes economic sense to all parties, in a world where the alternative is death or permanent stasis. The discomfort we feel is the discomfort of recognizing that care, under capitalism, eventually has to pay for itself.
There's a darker reading available here, one that Chiang would probably decline to endorse. The digients are workers. They start as entertainment products, become children of a sort, and end up being evaluated for their potential as sex workers. This is the lifecycle of software objects, but it's also, less metaphorically, the lifecycle of human beings under late capitalism: raised to be productive, discarded when the platform moves on, forced into increasingly precarious forms of work to justify their continued existence. The novella's final scene, in which Ana promises Jax they'll figure out adulthood together, reads less like hope and more like a parent promising a child that everything will be okay while the house burns down.
What everyone gets wrong about this novella is the assumption that it's fundamentally optimistic, a gentle story about love transcending technological change. Chiang ends with Ana having "a self-consciously fantastic vision of a future in which digients and humans live as equals in peace and harmony." But the operative phrase is "self-consciously fantastic." Chiang isn't predicting this future. He's showing us a character who needs to believe in it to keep going. That's not optimism. That's what happens when the alternative—admitting that the things you love may simply cease to be economically viable—is unbearable.
The real subject of the novella isn't artificial intelligence at all. It's the question of what we owe to beings we've brought into existence, in a world that has no mechanism for valuing care work, in an economy that renders every relationship contingent on someone's willingness to pay. The digients are a metaphor, but they're a metaphor for us. We're all software objects waiting to see if the platform that hosts us will survive its next funding round.