Google Translate can do many things. One thing it can’t do, according to a judge, is help you give constitutionally required consent to a police search.
A court in Kansas earlier this month ruled in the case of Omar Cruz-Zamora, a native of Mexico who was in the U.S. on a visa. He was arrested for possessing cocaine and methamphetamine after consenting to a police search of his car. His attorneys filed a motion to suppress the evidence. Why? Google Translate, they argued, didn’t meet the threshold of the Fourth Amendment, which prohibits unreasonable searches and seizures.
Here’s what happened, according to the court:
An officer, realizing Omar spoke limited English, began using Google Translate on his laptop to communicate with him. Omar eventually revealed he had $7,700 in cash in his car, money he said he planned to use to buy a vehicle to take back to Mexico. The officer asked if he could search the car, typing the question into Google Translate and then, for emphasis, pointing two fingers at his own eyes and then at the car.
Omar answered, “yeah, yeah go” and told the officer not to steal his money. He stood near the edge of the road while the car was searched. The officer eventually found 14 pounds of methamphetamine and cocaine.
The court found that the defendant struggled to understand the officer’s questions at points throughout their interaction, even with Google Translate in the mix. The ruling seemed to turn on the court’s finding that, as is often the case with machine translation, such services have their limits: they work best as a limited aid, not as a replacement for a live interpreter, and certainly not as the medium for an entire conversation.
Here’s why. If you go to Google Translate and type in “Can I search the car?”, the Spanish translation you get is “¿Puedo buscar el auto?” But if you start with that Spanish sentence and translate it back into English using Google Translate, you get “Can I find the car?” That’s because “buscar” means “to look for,” not “to search” in the investigative sense.
That’s why the court noted that Google Translate often provides a “literal but nonsensical translation.” One that misses the context, in other words.
In this case, at least, Google Translate was found to be missing the context needed to help someone consent to a search. It’s an interesting reminder that even as tools like these proliferate and AI seeps into more of our lives, AI still has a long way to go before it can claim human-level understanding of the world. And that, for now, these remain just that: tools with remarkable capabilities that still need plenty of refinement.