Kris changed the final “3” in “hu.58.107323” to a “4”. Then he copied “langExcept(‘hu.58.107324’,1,0,0),” pasted it into seven other files and then entered “no i’m not busy you?” into a Gmail chat box and “they didn’t say much really” into a Facebook chat box.

He leaned back in his chair and propped his legs up onto his cubicle desktop. His legs were all smooth skin, but he could now see hairs returning to his feet, so he felt around the floor with his toes to find his sandals. On the Multiversant campus outside, he could see co-workers playing hacky sack. He just had to finish up this work on the Hungarian algorithm, then he could join them.

In the work order’s description, Kris could see that they were speaking strangely in Hungary lately, or at least their slang appeared strange when analyzed by someone who didn’t speak Hungarian. They were saying things similar to “when are you thinking?” for “what time are you thinking of?” These new vernacular phrases had been mapped to their analogues in ninety-four other languages, but they were anomalous enough to require a new function to decide when to use them.

In Gmail, Lucia, the girl he had just met, asked “what you workin on?” Kris responded “really want to know? inserting a function to map hungarian phrases to other languages.” She wrote “ha ha no se how you do that”.

He wasn’t always sure how he did this either. On the screen before him, Kris tried to find his place among thousands of lines of indented code. It resembled a family tree except that there were no familiar names. Instead of names, there were questions and instructions. To trace the line of descent through this tree, you had to take a scenario, answer questions about it and then follow the prescribed path. Kris tried the main scenario. Does the source language equal Hungarian? Yes, he muttered to himself. Is the document’s context vernacular? No. Is the current phrase’s context vernacular? Yes. Here he added “Does the current phrase have an exception in the target language?” After this point, the code needed to be modified to reference the new function. He got through that much. But then came considerations of user preferences, grammar-check and context-check, which all had their own algorithms. Just the order of the questions had an immense effect on the answers.

He could never visualize Multiversant’s entire process to translate even a short sentence. He just worked with small portions at a time. Every time he looked at the algorithm, he had to retrace his steps. He could only alter it by making a change that he assumed was the right change, then processing the 780,000 test cases to watch for unintended consequences. His team had estimated that fewer than a hundred cases should be altered by this change. If more failed to output as expected, then they would consider the edits to be problematic. This didn’t mean that they actually were problematic—a hundred was only chosen because it was a round number. But no one else was sure how the translations would be changed until Multiversant spit them out.

Kris pulled his phone out of his shorts pocket. A text read “your boss’s boss is out here, come on out”. Kris finished typing a new function in the code. He responded “in a bit,” then sent the test cases through. The number of differences appeared: 1,639,488. Something blatant must have gone wrong.

His Gmail chat was blinking from Lucia. She wrote “crazy that i’m talkin to someone that can design a new brain ha ha. no se something good might come of it.” She didn’t speak broken English at all; she didn’t say things like “I no want that,” but she was always typing “no se” into every other sentence. Kris wondered why she’d say something like “I no see how something good might come of that”.

He looked back at the code and moved one branch of conditional statements down a node and reran the test cases: 342 differences. All right, he thought. This is going to work.

Soon, in Hungary, a few thousand people will try out slang that’s not even a year old and be astonished that the software understands it. They’ll probably think, as most people now did, that the software understood it because the software is “smart”. This irked Kris. Journalists were always writing about Multiversant as though the software designed itself. “In its short five-year life, Multiversant has learned to hear what we mean, even when we say something else.” More than a few journalists tried their hands at being visionaries: “A child hears bedtime stories and later goes off to become an author; Multiversant has heard all of our bedtime stories, in all of our languages, and has now gone off to author, in effect, a language of all languages.” Kris sometimes read these articles in one window while he was programming something that “designed itself” in another window.

When Kris looked at the application on his phone, he didn’t see artificial intelligence. He saw a row of buttons that used to do nothing. He saw a screen that once displayed the wrong name of the language you chose. When you programmed, you saw how wrong everything could be. Any aura of automation dissipated the moment you spoke into a test version of the software and, instead of a translation, were given your own words in your own voice to wince at.

These types of errors had the unmistakable imprint of a human being. Every change they made, with the intention of improvement, briefly fucked something up. But then someone—or hundreds of people: programmers, database administrators, linguists, product managers, regular users who were unknowingly testing it—would spend innumerable hours analyzing, replanning, debating, approving, reprogramming, testing, debugging. When, finally, it all worked, the version was locked down into a release. Then anyone who didn’t work on the application found the latest updates to be “nothing short of miraculous”. They felt the aura.

What Kris felt was craft. There was an immense gratification in craft, in human invention using other human invention to create something that elevated us all. Even in concept, the translation software was far more craft than aura. The application did not natively know that a phrase in Korean had the same meaning and connotations as a phrase in Portuguese. That approach, the semantic approach, had been tried and still provided sharply limited results. Instead, the application primarily worked through probability, through math. You took millions of human-translated documents and observed that a given Korean phrase was translated 86% of the time to a certain Portuguese phrase. You therefore had 86% confidence that using the same translation would be accurate. The odds could be increased when you factored in the surrounding context. Multiversant’s translations were so good because human beings had, one by one, done these translations.

Kris sometimes imagined the translation process performed on paper, which was actually possible if you felt like spending your life making billions of paper index cards of phrases in each language, along with manual rankings of their translations. You could do this and—if time and labor didn’t matter—it might be just as accurate as the electronic translation. But don’t do that, Kris thought. Don’t make a paper index. Don’t spend all that time. We built machines to do these chores because they were so dreadful to do by hand. If a million translators have already done the work, why have them do the same work over and over again?

He wanted to ask Lucia to get together later, but he looked again at her last sentence: “no se something good might come of it”. Was she saying she didn’t see what good could come of what he spent his days working on? He thought about that “no se”. He picked up his phone and said “no see” into it. In a Russian voice, “but behold” was returned. That wasn’t right. She only spoke English and Spanish. He tried “no say”. His phone displayed “Spanish” and translated “I don’t know”. He now saw what she was saying. She was saying “I don’t know; something good might come of that”.

 

Therese sat sunken in her chaise lounge watching the evening shade seep over the backyard. The day was turning away from them. They would soon have to go in for baths and homework protocols and more duties than Therese could fathom at the moment. Right now, she was very slowly considering the damp squish of grass and dirt between her toes while the last reaches of sun brightly blotted over her vision. She was silently paused in a state that needed no articulation, no translation.

Ben came sneaking up across the lawn, smiling and hunched down, as though no one could see him. Carey and Briana were at the top of the backyard’s slope, looking up from their phones. The girls were laughing already, before their dad had even said a thing. Ben quickened his pace and snatched one, and then another, of their phones. Holding them up, he commenced a light family comedy. The girls happily whined about the unfair theft and pretended they could only jump an inch from the ground to regain the phones. He could all but banish fun, Therese thought, and the girls would still greet his actions with merriment.

“You think this phone translates Briana?” Ben asked. “Let’s try it then: ‘I’ll pick the toys up after school’.” The first phone passed it to the other phone via another continent’s language. The other phone correctly retranslated “I’ll pick the toys up after school”. But Ben put one hand on his hip, tilted his head and gave the girls a look of exaggerated dismay. “That didn’t translate right! Because when Briana says she’ll pick something up, she really expects somebody else to pick it up.”

Briana desperately protested “Nooooo!” And Ben mimicked her tone while upping her eight-year-old’s attitude by several degrees of indignation: “Nooooo!”

The phones, both left on, translated and retranslated: “No.” “No.”

Briana kicked the ground and grunted. Ben put on an emoticon of a sad face and gave a long, satirically pouty “ohhhh …”

The phones returned “oh”.

When Ben and Carey laughed at the lagging translation, Briana cried “This isn’t funny!”

Ben mimicked Briana’s sassy whimpering with nothing but “Na na-na nunny …”

Then Therese heard the software say something she hadn’t heard in many versions. It said “I’m sorry. I couldn’t understand you.” The software couldn’t understand what was plain to everyone else here. It wasn’t the meaning it missed; it was the tone. It wasn’t sensitive enough.

Ben said, “If you say you’re sorry, Briana, I’ll help you pick things up.”

Briana started to reluctantly accept this offer when the phones started in with “If you say you’re sorry …” She grabbed her phone back and muted it. “I know!” she said. “That’s rude to say it again like that.”

The all-knowing voice hadn’t realized it was being rude. It would have to be told this in order to realize it. This, then, Therese saw, was the software: the sound of omniscience, saying only what we’ve told it to say.