Earlier today, Gizmodo obtained and published documents from the Tempe, Arizona police department that appear to show Uber’s safety driver was streaming Hulu at the time an Uber self-driving vehicle hit and killed a pedestrian. The documents are, oddly, the best-case scenario for Uber: It can blame the accident on a misbehaving human, rather than a shoddily designed testing process.

Uber spokespeople speaking to Gizmodo already appear to be steering the conversation towards blaming Rafaela Vasquez, the operator in the Uber vehicle. Even if Vasquez was fully negligent — and the documents do seem to show that — the details about Uber’s self-driving test program that have emerged since the crash show that the company bears at least as much responsibility for the fatality as any safety driver, negligent or not.

According to Gizmodo, police sent search warrants to YouTube, Netflix, and Hulu looking for Vasquez’s account usage around the time of the crash. The data “shows Vasquez was streaming an episode of The Voice called ‘The Blind Auditions, Part 5’ between 9:16pm and 9:59pm local time,” which correlates with the time of the crash.

A Tempe police detective also analyzed the video footage of Vasquez before the crash, which shows actions consistent with someone watching TV:

“She appears to be looking down at the area near her right knee at various points in the video,” the report reads. “During the 9 video clips, I found that the driver looked down 204 times with nearly all of them having the same eye placement at the lower center console near her right knee. One hundred sixty-six of these instances of looking down occurred while the vehicle was in motion.” Vasquez appeared to laugh or smirk during moments when she was looking towards her knee, the report added.

Although no verdict has been made public, the evidence we know so far sure seems to show that Vasquez was watching TV rather than giving the road her full and undivided attention, which was surely a factor in the crash.

Uber’s PR team is already swinging into action, giving a statement to Gizmodo that doesn’t explicitly point any fingers, but sure looks like the early steps of shifting the blame.

“We continue to cooperate fully with ongoing investigations while conducting our own internal safety review. We have a strict policy prohibiting mobile device usage for anyone operating our self-driving vehicles,” an Uber spokesperson told Gizmodo. “We plan to share more on the changes we’ll make to our program soon.”

But here’s the thing: even if Vasquez wasn’t watching The Voice, police documents also clearly show that Uber’s safety procedures were insufficient. As an NTSB report on the crash showed, Uber’s safety drivers were charged with monitoring the self-driving system through a tablet in the vehicle as well as monitoring the road, institutionalizing the same kind of staring-at-a-screen distraction that Vasquez was seemingly doing on her own.

The police documents also back up previous reports that Uber’s self-driving vehicles were unable to perform emergency braking, relying entirely on the human driver to intervene in a dangerous situation. There was also no way for the vehicle to warn the driver if it detected a dangerous situation:

“During the current development phase, vehicle operators are relied upon to perform evasive maneuvers,” one Tempe detective wrote. “I was not able to find anywhere in the literature that the self-driving systems alerts the vehicle operator to potential hazards or when they should take manual control of the vehicle to perform an evasive maneuver.”

If true, the allegations that Uber’s safety driver was watching Hulu are harrowing, and Vasquez will likely face prosecution. But we also shouldn’t let Vasquez be the scapegoat for an Uber self-driving test program that, by all accounts, was too insufficient and unsafe to be run on public roads, regardless of who was behind the wheel.